The present invention relates to indicia reading terminals in general and in particular to an optical indicia reading terminal.
Indicia reading terminals are available in multiple varieties. The well known gun style reader, as commonly seen at retail store checkout counters, is typically available in a form devoid of a keyboard and display. Enhanced functioning indicia reading terminals having keyboards, displays, and advanced networking communication capabilities are also available. Typically, indicia reading terminals have triggers for activating decode attempts. Whatever the variety, users of such indicia reading terminals desire snappiness of operation. A terminal's trigger to read (TTR) time is a measure of the delay between the time a trigger is actuated for initiating a decode attempt and the time a decoded message is output.
It has been observed that long trigger to read times occur when a terminal consumes time attempting to decode frames of poor quality, the poor frame quality resulting from the frame being devoid of a decodable indicia representation or being of otherwise insufficient quality (due to, e.g., poor illumination, poor focusing, and/or hand jitter) to permit decoding. In one instance, a frame of image data devoid of a decodable indicia representation, and therefore of low quality, can be processed in accordance with a decoding application for a period of up to tens of frame times until a timeout period is reached, without a decoded message being output. In another instance, a frame including a decodable indicia, but of insufficient quality to allow for decoding, can likewise be subject to processing in accordance with a decoding application for up to tens of frame times until a timeout period is reached, without a decoded message being output.
There exists a need to improve TTR times for indicia reading terminals.
There is described an indicia reading terminal that can be operative to capture a succession of frames of image data and that can be operative so that a certain frame of the succession of frames is subject to quality evaluation processing where a result of the quality evaluation processing is responsive to one or more of an incidence and sharpness of edge representations of the frame of image data.
The features described herein can be better understood with reference to the drawings described below. The drawings are not necessarily to scale, emphasis instead generally being placed upon illustrating the principles of the invention. In the drawings, like numerals are used to indicate like parts throughout the various views.
There is described an indicia reading terminal that can be operative to capture a succession of frames of image data and that can be operative so that a certain frame of the succession of frames is subject to quality evaluation processing where a result of the quality evaluation processing is responsive to one or more of an incidence and sharpness of edge representations of the frame of image data.
In one embodiment, an indicia reading terminal can be operative so that operation of the indicia reading terminal is responsive to a result of the frame quality evaluation processing. In one example, an indicia reading terminal can be operative so that the terminal ceases attempting to decode utilizing a previous frame in response to the result of the processing. In another example, a decoding application for decoding of a frame of image data can be changed from an inactive state to an active state in response to a result of the processing. In one embodiment, frame quality evaluation processing can be carried out by a central processing unit (CPU). In one embodiment, frame quality evaluation processing by the CPU can be restricted from consuming more than a predetermined processing period. In one example, the CPU, for frame quality evaluation processing, can be restricted from consuming a time period of more than a frame time, and in one particular example, from consuming more than a predetermined fraction of a frame time.
A functional block drawing illustrating an indicia reading terminal is shown in
An exemplary hardware platform for carrying out the described method is shown and described with reference to the block diagram of
In the course of operation of terminal 1000, image signals can be read out of image sensor 1032, converted, and stored into a system memory such as RAM 1080. A memory 1085 of terminal 1000 can include RAM 1080, a nonvolatile memory such as EPROM 1082, and a storage memory device 1084 such as may be provided by a flash memory or a hard drive memory. In one embodiment, terminal 1000 can include CPU 1060, which can be adapted to read out image data stored in memory 1080 and subject such image data to various image processing algorithms. Terminal 1000 can include a direct memory access unit (DMA) 1070 for routing image information read out from image sensor 1032 that has been subject to conversion to RAM 1080. In another embodiment, terminal 1000 can employ a system bus providing a bus arbitration mechanism (e.g., a PCI bus), thus eliminating the need for a central DMA controller. A skilled artisan would appreciate that other embodiments of the system bus architecture and/or direct memory access components providing for efficient data transfer between the image sensor 1032 and RAM 1080 are within the scope and the spirit of the invention.
Referring to further aspects of terminal 1000, terminal 1000 can include an imaging lens assembly 1110 for focusing an image of a decodable indicia located within a field of view 40 on a substrate 50 onto image sensor array 1033. Imaging light rays can be transmitted about imaging axis 25. Lens assembly 1110 can be adapted to be capable of multiple focal lengths and multiple best focus distances.
Terminal 1000 can also include an illumination pattern light source bank 1204 for generating an illumination pattern 60 substantially corresponding to a field of view 40 of terminal 1000 and an aiming pattern light source bank 1208 for generating an aiming pattern 70 on substrate 50. In use, terminal 1000 can be oriented by an operator with respect to a substrate 50 bearing decodable indicia 15 in such manner that aiming pattern 70 is projected on a decodable indicia 15. In the example of
Terminal 1000 can also include a number of peripheral devices such as display 1304 for displaying such information as image frames captured with use of terminal 1000, keyboard 1404, pointing device 1406, and trigger 1408 which may be used to make active a trigger signal 702 for activating frame readout and/or certain decoding processes. Terminal 1000 can be adapted so that activation of trigger 1408 activates trigger signal 702 and initiates a decode attempt.
Terminal 1000 can include various interface circuits for coupling various of the peripheral devices to system address/data bus (system bus) 1500, for communication with CPU 1060 also coupled to system bus 1500. Terminal 1000 can include interface circuit 1028 for coupling image sensor timing and control circuit 1038 to system bus 1500, interface circuit 1118 for coupling lens assembly control circuit 1120 to system bus 1500, interface circuit 1218 for coupling illumination assembly control circuit 1220 to system bus 1500, interface circuit 1302 for coupling display 1304 to system bus 1500, and interface circuit 1402 for coupling keyboard 1404, pointing device 1406, and trigger 1408 to system bus 1500.
In a further aspect, terminal 1000 can include one or more I/O interfaces 1604, 1606 for providing communication with external devices (e.g., a cash register server, a store server, an inventory facility server, a peer terminal 1000, a local area network base station, a cellular base station). I/O interfaces 1604, 1606 can be interfaces of any combination of known computer interfaces, e.g., Ethernet (IEEE 802.3), USB, IEEE 802.11, Bluetooth, CDMA, GSM.
A succession of frames of image data that can be captured and subject to the described processing can be full frames (including pixel values corresponding to more than about 80% of pixels of image sensor 1032). A succession of frames of image data that can be captured and subject to the described processing (e.g., frame quality evaluation processing) can also be “windowed frames” comprising pixel values corresponding to less than about 80%, and in some cases less than about 50% and in some cases less than 10% of pixels of image sensor 1032. A succession of frames of image data that can be captured and subject to the described processing can also comprise a combination of full frames and windowed frames. A full frame can be captured by selectively addressing for readout pixels of image sensor 1032 corresponding to the full frame. A windowed frame can be captured by selectively addressing for readout pixels of image sensor 1032 corresponding to the windowed frame.
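By way of illustration only (this sketch is not from the application text), a windowed frame can be modeled in software as a rectangular selection of the full pixel array. In hardware the window is realized by selectively addressing sensor pixels for readout; the cropping below merely simulates the resulting pixel selection, and the function names are hypothetical.

```python
# Illustrative sketch: a "windowed frame" comprises pixel values for only a
# subset of image sensor pixels. In hardware this is achieved by selectively
# addressing pixels for readout; here a crop of a full frame simulates it.

def windowed_frame(full_frame, top, left, height, width):
    """Return the pixel values a (top, left, height, width) readout window
    would produce from the sensor's full pixel array."""
    return [row[left:left + width] for row in full_frame[top:top + height]]

def window_fraction(full_shape, window_shape):
    """Fraction of sensor pixels included in the window (e.g., < 0.5 for a
    windowed frame, near 1.0 for a full frame)."""
    (full_rows, full_cols), (win_rows, win_cols) = full_shape, window_shape
    return (win_rows * win_cols) / (full_rows * full_cols)
```

A windowed frame covering a quarter of a sensor's pixels would give a window fraction of 0.25, well under the roughly 80% coverage that characterizes a full frame.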
Terminal 1000 can capture frames of image data at a rate known as a frame rate. A typical frame rate is 60 frames per second (FPS) which translates to a frame time (frame period) of 16.6 ms. Another typical frame rate is 30 frames per second (FPS) which translates to a frame time (frame period) of 33.3 ms per frame.
Referring to
An example of an indicia reading terminal 1000 operating in accordance with described processing is described with reference to the timing diagram of
Referring to the timing diagram of
Referring to the embodiment described with reference to the timing diagram of
In the specific example of
It should be noted that when switching to decoding a new frame (i.e., the switch from frame=frame0 to frame=frame2) terminal 1000 may not discard the results of decoding the previous frame. For example, in some instances, a decodable indicia subject to decoding can be a bar code of a symbology type that is decodable to output code words. Code words of a bar code symbol are not complete decoded messages but can be combined with other code words of the symbol to provide a complete decoded message. A decoded code word of a bar code symbol may be regarded as a partially decoded message. Symbologies which may be decoded to provide code words representing a partial decoded message of a bar code symbol include PDF417, UPC, Datamatrix, QR Code, and Aztec. Terminal 1000 can be operative to accumulate partially decoded messages determined by processing a set of subject frames until a decoded message for a symbol is determined.
With reference to the example of
In some embodiments, a decision to cease decoding processing of a certain frame can be responsive to a decode status. As indicated in
For decoding bar code decodable indicia of certain symbologies, CPU 1060 can be adapted to combine partial decode results determined from two or more different frames. A partial decode result provided by decoding a frame of image data can take the form of a set of code words. CPU 1060 can be adapted to determine a first set of code words by processing a certain frame of a set of frames while a trigger signal 702 is active and to combine the first set of code words with a second set of code words determined by processing of a subsequent frame while the trigger signal 702 remains active. In one embodiment, CPU 1060 can be adapted so that CPU 1060 can process a certain frame to determine a first set of code words, a subsequent frame to provide a second set of code words, and possibly M further subsequent frames to provide additional sets of code words. CPU 1060 can further be adapted to combine the first, second, and possibly M additional sets of code words to provide a decoded message. For example, with reference to the timing diagram of
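The accumulation of partial decode results across frames described above can be sketched as follows. This is a minimal illustration, not the application's implementation: the code-word representation (a position keyed mapping), the completeness test, and the assembly function are all assumptions supplied by the caller.

```python
# Hypothetical sketch of accumulating partial decode results (sets of code
# words) across successive frames until a complete decoded message can be
# assembled. The (position, codeword) representation is an assumption.

def accumulate_code_words(frames, decode_frame, message_complete, assemble):
    """Combine code-word sets from successive frames while a trigger remains
    active.

    decode_frame(frame)        -> iterable of (position, codeword) pairs
    message_complete(codewords)-> True when enough code words are collected
    assemble(codewords)        -> the complete decoded message
    """
    collected = {}
    for frame in frames:
        for position, codeword in decode_frame(frame):
            # Keep the first code word decoded for each symbol position.
            collected.setdefault(position, codeword)
        if message_complete(collected):
            return assemble(collected)
    return None  # frames exhausted without a complete message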
In one embodiment, the processing at periods 720, 721, 722, 723, 724 for image quality evaluation can be restricted from consuming more than a predetermined period of time. In one embodiment, the predetermined time period is a time period of less than one frame period. In such manner, CPU 1060 is assured of completing quality evaluation for a certain frame, frame=framej prior to a time that a successive frame, frame=framej+1, is available for processing by CPU 1060.
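The time restriction above can be sketched as follows. The incremental sample-until-deadline strategy is an assumption (the application only states that evaluation is restricted to a predetermined period); the point illustrated is that a partial score is still returned when the budget expires, so evaluation of frame j always completes before frame j+1 is available.

```python
import time

# Illustrative sketch (not the application's implementation) of restricting
# frame quality evaluation to a time budget of less than one frame period.
# Pixel pairs are scored until the deadline passes; the score accumulated so
# far is returned rather than overrunning into the next frame time.

def quality_score_time_limited(pixels, edge_strength, budget_s):
    """Accumulate edge-strength contributions until budget_s seconds elapse."""
    deadline = time.monotonic() + budget_s
    score = 0
    for i in range(1, len(pixels)):
        score += edge_strength(pixels[i - 1], pixels[i])
        if time.monotonic() >= deadline:
            break  # budget spent; return the partial score
    return score
```

For a 60 FPS capture, `budget_s` would be chosen well under 16.6 ms, e.g. a predetermined fraction such as half a frame time.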
In one specific embodiment, with reference to time plot 714 of
Another illustrative example of a described processing is described with reference to
Variations of processing for quality evaluation and for decoding are now described. In one example, for evaluating a frame of image data for quality, CPU 1060 can apply a filter set to a frame of image data. For example, the edgelet detector filter set below can be applied.
Terminal 1000 can convolve each pixel value of a sample of pixel values with each of the above edgelets to determine an edge strength score for each pixel position of a frame. Edge strength scores for pixel positions of a sample can be summed for determining a quality score for a frame. Other calculations utilizing edge strength scores can be applied to determine alternative quality score statistics. In another aspect, texture filters need not be applied by CPU 1060 to frames of image data stored in a CPU addressable memory. For example, as shown in
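The edgelet filter set referenced above appears in a figure not reproduced here, so the 2×2 kernels in the following sketch are illustrative stand-ins (horizontal, vertical, and two diagonal edgelets), not the application's actual filter set. The scoring structure, however, follows the text: convolve each pixel position with each edgelet, take an edge strength score per position, and sum over the sample.

```python
# Illustrative edgelet detector filter set; the 2x2 kernels are hypothetical
# stand-ins for the filter set shown in the application's figure.
EDGELETS = [
    [[1, 1], [-1, -1]],   # horizontal edge
    [[1, -1], [1, -1]],   # vertical edge
    [[1, -1], [-1, 1]],   # diagonal edge
    [[-1, 1], [1, -1]],   # anti-diagonal edge
]

def frame_quality_score(frame):
    """Sum, over pixel positions, the strongest absolute edgelet response."""
    rows, cols = len(frame), len(frame[0])
    total = 0
    for r in range(rows - 1):
        for c in range(cols - 1):
            responses = []
            for kernel in EDGELETS:
                resp = sum(kernel[i][j] * frame[r + i][c + j]
                           for i in range(2) for j in range(2))
                responses.append(abs(resp))
            total += max(responses)  # edge strength score at this position
    return total
```

A flat (edge-free) frame scores zero, and a sharper edge produces a larger score, consistent with the quality score rising with the incidence and sharpness of edge representations.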
In another example, quality evaluation processing can comprise sampling a frame of image data along a plurality of sampling paths and calculating autocorrelation scores for each of the sampling paths. In one example, sampling paths 802, 804, 806, 808, 810, 812, 814, 816, 818, 830, 832, 834, 836, 838, 840, 842, 844, 846 can be selected, as are indicated in
S_path = Σ (I_n − I_(n−1))²    Equation 1

where I_n is the pixel value at a certain pixel position n of a path, and I_(n−1) is the pixel value at the pixel position adjacent to the nth pixel position. For reduction of the clock cycles required for performing the calculation of Equation 1, an approximation of the result of Equation 1 can be carried out by executing the calculation:

S_path = Σ |I_n − I_(n−1)|    Equation 2
Further according to a process for evaluating a quality of a frame of image data, a quality score for a frame of image data can be determined utilizing autocorrelation scores for the paths 802, 804, 806, 808, 810, 812, 814, 816, 818, 830, 832, 834, 836, 838, 840, 842, 844, 846. In one example, a sum of autocorrelation scores for a frame can be taken as a measure of a quality of a frame.
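The path scores of Equations 1 and 2 and the frame quality score taken as their sum can be sketched directly:

```python
# Sketch of the autocorrelation scores of Equations 1 and 2 applied to a
# sampling path, and a frame quality score taken as the sum over paths.

def path_score(path):
    """Equation 1: sum of squared adjacent-pixel differences along a path."""
    return sum((path[n] - path[n - 1]) ** 2 for n in range(1, len(path)))

def path_score_fast(path):
    """Equation 2: absolute-difference approximation of Equation 1,
    avoiding the multiply per pixel pair."""
    return sum(abs(path[n] - path[n - 1]) for n in range(1, len(path)))

def frame_quality(sampling_paths):
    """Quality score for a frame: sum of autocorrelation scores of its
    sampling paths."""
    return sum(path_score(p) for p in sampling_paths)
```

A path crossing many sharp dark/light transitions yields a high score under either equation, while a path through uniform background scores near zero.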
Using either of the described methods (application of edgelet filters or calculation of autocorrelation scores for sampling paths), it is seen that either an increased incidence or an increased sharpness of edge representations in a frame will result in an increased quality score for the frame.
Referring now to processes that can be carried out by processing module 20 during, e.g., periods 730, 732 (
Where a decodable indicia representation is of a 2D bar code symbology, a decode attempt can comprise the steps of locating a finder pattern using a feature detection algorithm, locating scan lines intersecting the finder pattern according to a predetermined relationship with the finder pattern, determining a pattern of dark and light cells along the scan lines, and converting each cell pattern into a character or character string via table lookup.
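The final step above, converting dark/light cell patterns into characters via table lookup, can be sketched as follows. The two-entry table is purely hypothetical; it stands in for a real symbology's codeword table and is not taken from any specification.

```python
# Highly simplified sketch of table-lookup conversion of dark/light cell
# patterns (read along scan lines) into characters. The table below is a
# hypothetical stand-in, not any real symbology's encodation table.

PATTERN_TABLE = {
    (1, 0, 1, 1): "A",  # 1 = dark cell, 0 = light cell
    (0, 1, 1, 0): "B",
}

def cells_to_message(cell_patterns):
    """Map each cell pattern to a character; '?' marks an unknown pattern."""
    return "".join(PATTERN_TABLE.get(tuple(p), "?") for p in cell_patterns)
```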
In one embodiment, terminal 1000 can incorporate a multi-tasking operating system, and CPU 1060 can be programmed to contemporaneously execute, as part of separate processing threads, the described (a) frame quality evaluation processing and (b) decoding processing. However, whereas terminal 1000 can be operative so that frame quality evaluation processing is restricted from consuming more than one frame time, decode attempt (decoding) processing for attempting to extract a decoded message can, in one embodiment, consume more than one frame time. Accordingly, at a time CPU 1060 commences processing of a certain frame, frame=framej for frame quality evaluation, CPU 1060 may be processing a previous frame, framej−k, k≧1 for decoding. Referring to the timing diagram of
In another exemplary embodiment described with reference to
In one embodiment, frame quality evaluation processing can be utilized when terminal 1000 is controlled for selectively outputting a frame of image data. A frame can be selectively output e.g., by storing a frame to an onboard or external non-volatile memory and/or by displaying a frame to an onboard or external display. From time to time terminal 1000 can be controlled to selectively output a frame for display. For example, an operator may wish to subject to persistent storage or display, e.g., a frame representing a package or other article being shipped, an article bearing a decodable indicia, a person such as a delivery agent, or package recipient, or a customer at a point of sale, a shipping truck, etc. Referring to the timing diagram of
In one embodiment, terminal 1000 can be operative so that in response to a picture taking signal 704 being made active, a predetermined number of frames e.g., 10 frames are captured. Terminal 1000 can also be operative so that in response to picture taking signal 704 being made active, terminal 1000 selects the frame out of the predetermined number of frames having the highest quality score for selective output. The inventors recognized that quality evaluation processing described herein, where quality is responsive to an incidence and/or sharpness of edge representations can be a good indicator of visual quality as well as quality for decoding purposes. Referring to the timing diagram of
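The burst selection just described can be sketched minimally. The capture callback and scoring function are assumptions; the point illustrated is only that, of the predetermined number of frames captured after the picture taking signal goes active, the frame with the highest quality score is the one selected for output.

```python
# Sketch of selecting, from a burst of frames captured after a picture taking
# signal is made active, the frame with the highest quality score for output.
# quality_score may be, e.g., an edge-based score as described herein.

def select_output_frame(capture_frame, quality_score, burst_size=10):
    frames = [capture_frame() for _ in range(burst_size)]
    return max(frames, key=quality_score)
```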
In some operating modes, terminal 1000 can be made to selectively output a frame of image data responsively to a trigger signal 702 being made active for initiating decoding processing. Terminal 1000 can be adapted so that trigger signal 702 can be made active in response to an actuation of trigger 1408 or another user interface component of terminal 1000. It is seen from the timing diagram of
In one embodiment, terminal 1000 can be operative so that frame quality evaluation processing algorithms applied can be differentiated depending on whether a frame is output for display or archiving responsive to (a) trigger signal or (b) picture signal being activated. For example, when the output is in reference to trigger signal 702, a center weighted sampling path format can be used as shown in
Referring now to further aspects of terminal 1000, an additional feature of terminal 1000 in one embodiment is now described. In one embodiment, frame quality evaluation processing module 10 can input to logic module 30 a location of a decodable indicia in each frame of a succession of captured frames. Such a location can be referred to as a region of interest (ROI). Logic module 30 in turn can be configured to report the determined ROI location to decode attempt processing module 20 when logic module 30 messages decode attempt processing module 20 to commence decode processing of a frame designated for decoding (a designated decode frame). Accordingly, for a certain frame of a succession of frames, frame quality evaluation processing module 10 can determine a ROI for the certain frame, and decode attempt processing module 20 can utilize the ROI for attempting to decode the certain frame. In such manner, the speed of processing for decoding of a certain frame can be significantly increased, as the processing stage of searching can be avoided. The manner in which CPU 1060 can determine a location of decodable indicia as part of frame quality evaluation processing is now described. It is noted that as part of determining a quality score for a frame, CPU 1060 can calculate autocorrelation scores for each sampling path. In another aspect, the sampling path or paths yielding the highest autocorrelation score(s) are the path(s) including a representation of a decodable indicia. The paths yielding the highest autocorrelation scores will generally be the paths intersecting the greatest number of (and/or the sharpest) edge representations.
Accordingly, in one specific example, a location of a decodable indicia in a frame of image data can be determined, as part of the frame quality evaluation processing described herein, by subjecting a plurality of horizontal and vertical sampling paths to an autocorrelation function, selecting the sampling path(s) yielding the highest autocorrelation score(s) as the path(s) representing a decodable indicia, and, in one embodiment, further processing the path data as described herein. In one embodiment, CPU 1060, after calculating autocorrelation scores for various sampling paths, can further process path data for further discrimination of a location of a decodable indicia representation along a path. In one embodiment, paths selected as yielding the highest autocorrelation scores and indicating an intersection of decodable indicia representations can be subjected to processing for classifying path image data as either background image data or decodable indicia image data. In one example, for performance of such classification, terminal 1000 can binarize image data along each selected sampling path. For binarization, path pixel value image data can be compared to a threshold; all pixel values below the threshold can be classified as decodable indicia, and all pixel values above the threshold can be classified as background. A mean value of a path can be selected as a threshold, or the threshold can be selected according to T=½(Imax+Imin). In another example, Otsu's algorithm can be utilized for determination of the threshold. Segments of a sampling path characterized by a high density of decodable indicia can be classified as decodable indicia segments. Image data at a location determined to include a decodable indicia representation can include image data defining dark portions of a decodable indicia and image data at pixel positions about such indicia.
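The binarization and segment classification described above can be sketched as follows, using the midpoint threshold T = ½(Imax + Imin). The run-length grouping of dense indicia pixels into segments is a simplified assumption of what "high density of decodable indicia" means.

```python
# Sketch of path binarization with the midpoint threshold T = (Imax + Imin)/2:
# pixels below T are classified as decodable indicia (dark), pixels above T
# as background. Segment grouping via minimum run length is an assumption.

def binarize_path(path):
    t = (max(path) + min(path)) / 2.0
    return [1 if v < t else 0 for v in path]  # 1 = indicia, 0 = background

def indicia_segments(bits, min_run=2):
    """Return (start, end) index ranges of dense indicia runs along a path."""
    segments, start = [], None
    for i, b in enumerate(bits + [0]):  # sentinel closes a trailing run
        if b and start is None:
            start = i
        elif not b and start is not None:
            if i - start >= min_run:
                segments.append((start, i))
            start = None
    return segments
```

A mean-value threshold or Otsu's method could be substituted for the midpoint threshold without changing the segment classification step.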
An indicia reading terminal configured as described herein is particularly well suited for decoding a decodable indicia disposed on a vibrating substrate. For example, in one commonly encountered type of scene substrate 50 as shown in
A hand held indicia reading terminal having an image sensor and an imaging lens for focusing images onto the image sensor was configured to subject a succession of frames to the frame quality evaluation processing described herein, so that when a trigger signal was activated the terminal would subject a succession of frames to frame quality evaluation processing and cease subjecting a previous frame to a decode attempt if a current frame had a quality score exceeding that of the previous frame by a predetermined threshold. The hand held indicia reading terminal was directed at a vibrating substrate bearing a UPC symbol so that the field of view of the terminal included the UPC symbol. The terminal was held at a constant distance from the vibrating substrate, within 3 inches of a best focus distance of the terminal. The UPC symbol was illuminated at about 120 lux. The substrate was vibrated horizontally at 890 rpm with an amplitude of about ⅝ of an inch. A trigger signal was successively activated ten (10) times by successive actuations of the indicia reading terminal's trigger. The UPC symbol was successfully decoded ten (10) times within eight (8) seconds. The terminal was then modified so that the frame quality evaluation processing described herein was disabled. A trigger signal was again successively activated ten (10) times by successive actuations of the indicia reading terminal's trigger. Thirty-eight (38) seconds were consumed before the indicia reading terminal successfully decoded the symbol ten times.
The experiment was repeated disposing a QR Code symbol on the vibrating substrate, with remaining controls maintained as described in Example 1. In response to ten (10) successive trigger signal activations, the QR Code symbol was decoded ten (10) times within fifteen (15) seconds. The experiment was repeated disabling frame quality evaluation processing as described herein. A trigger signal was successively activated ten times by successive actuation of the terminal's trigger. Decode times for decoding the QR Code symbol ten times ranged from 48.1 seconds to 99.8 seconds.
In one embodiment, terminal 1000 can be operative to incorporate, in addition to or as an alternative to the processing functionality described herein above, the processing functionality described in co-pending U.S. patent application Ser. No. 12/242,244, entitled “Method And Apparatus For Operating Indicia Reading Terminal Including Parameter Determination,” filed Sep. 30, 2008 and incorporated herein by reference in its entirety. In one embodiment, the context determination processing, the quality determination processing, and the parameter determination processing described in the referenced U.S. patent application Ser. No. 12/242,244, which can be carried out on a possibly time-restricted and per frame basis for a succession of frames, can be carried out as part of the possibly per frame and possibly time restricted frame quality evaluation processing described herein above. Accordingly, in one example, terminal 1000 can be adapted so that, within a restricted time for each of a succession of frames, terminal 1000 can carry out each of quality evaluation processing and ROI determination as described herein, as well as context detection processing and parameter determination processing as described in referenced U.S. patent application Ser. No. 12/242,244. Excerpts of referenced U.S. patent application Ser. No. 12/242,244 are presented herein below with figure enumerations changed to avoid duplication.
[Beginning of excerpted sections of referenced U.S. patent application Ser. No. 12/242,244]:
Referring to the flow diagram of
A generic description of a method having been described with reference to the flow diagram of
A specific example of step 100 (processing image information for determining a location of a decodable indicia) is described with reference to the timing diagram of
At block 100, terminal 1000 in one embodiment, can process image information for determining a location of decodable indicia by completing capturing of a frame of image data by storage of a frame into RAM 1080, where it is addressable by CPU 1060 and subjecting the stored frame of image data to processing. An exemplary method for capturing image data for processing by CPU 1060 is described with reference to the timing diagram of
Following each exposure period Exp0, Exp1, Exp2, Exp3 . . . image information in the form of voltage signals can be read out from image sensor 1032. The readout of image information from image sensor 1032 can be in response to applications of readout control pulses of readout control signal 2120 as shown in the timing diagram of
A succession of frames of image data that can be captured and subject to the described processing can be full frames (including pixel values corresponding to more than about 80% of pixels of image sensor 1032). A succession of frames of image data that can be captured and subject to the described processing (e.g., context processing, parameter determination, decoding) can also be “windowed frames” comprising pixel values corresponding to less than about 80%, and in some cases less than about 50% and in some cases less than 10% of pixels of image sensor 1032. A succession of frames of image data that can be captured and subject to the described processing can also comprise a combination of full frames and windowed frames. A full frame can be captured by selectively addressing for readout pixels of image sensor 1032 corresponding to the full frame. A windowed frame can be captured by selectively addressing for readout pixels of image sensor 1032 corresponding to the windowed frame.
Terminal 1000 can capture frames of image data at a rate known as a frame rate. A typical frame rate is 60 frames per second (FPS) which translates to a frame time (frame period) of 16.6 ms. Another typical frame rate is 30 frames per second (FPS) which translates to a frame time (frame period) of 33.3 ms per frame.
After a frame of image data is captured at block 100 by storage into RAM 1080, the frame of image data can be subject to context detection for determining a location of a decodable indicia representation. At block 100, for context detection, a frame of image data can be sampled along a plurality of sampling paths. A representation of a frame of image data is shown in
Further, for processing a frame of image data for context detection, an autocorrelation function can be applied to sampled image data. In one example, an autocorrelation function can be applied utilizing image data of the sampling path. An autocorrelation function for a sampling path can comprise the formula
S_path = Σ (I_n − I_(n−1))²    Equation 1

where I_n is the pixel value at a certain pixel position n of a path, and I_(n−1) is the pixel value at the pixel position adjacent to the nth pixel position. For reduction of the clock cycles required for performing the calculation of Equation 1, an approximation of the result of Equation 1 can be carried out by executing the calculation:

S_path = Σ |I_n − I_(n−1)|    Equation 2
From the formulas of Equation 1 and Equation 2, it is seen that sampling paths that intersect representations of decodable indicia will likely yield higher autocorrelation scores than those that do not (where no decodable indicia representation is included along a sampling path and the sampling path comprises similarly valued pixels, the autocorrelation score will be relatively low; however, high autocorrelation scores will result when a sampling path intersects a decodable indicia representation including a plurality of dark/light transitions). Accordingly, a high autocorrelation score for a sampling path will serve as an indication that the sampling path is likely to have intersected a decodable indicia representation.
Further at block 100, high autocorrelation scores (those indicating the inclusion of a decodable indicia representation) can be discriminated from low autocorrelation scores (those indicating the absence of a decodable indicia representation) along a sampling path. For performance of such discrimination, the sampling path or paths yielding the highest autocorrelation score(s) can be selected as the path or paths indicating that a decodable indicia is represented. For example, the horizontal sampling path yielding the highest autocorrelation score in the horizontal direction, and the vertical sampling path yielding the highest autocorrelation score in the vertical direction can be selected as paths that indicate that a decodable indicia is represented. Also, autocorrelation scores can be subject to a threshold. Those scores above a threshold can be regarded as indicating that image data utilized for the calculation includes a decodable indicia representation and those under a threshold can be regarded as indicating that image data utilized for the calculation does not include a decodable indicia representation. Such a threshold can be predetermined and fixed or can be variable and dynamically determined.
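The discrimination step above can be sketched as two alternatives: keep the single highest-scoring path, or keep all paths whose autocorrelation score exceeds a threshold. The path identifiers and score values in the sketch are hypothetical.

```python
# Sketch of discriminating high autocorrelation scores (indicating a decodable
# indicia representation) from low scores. Path ids and scores are examples.

def highest_scoring_path(scores):
    """scores: mapping of sampling-path id -> autocorrelation score.
    Returns the path most likely to intersect a decodable indicia."""
    return max(scores, key=scores.get)

def paths_above_threshold(scores, threshold):
    """Alternative discrimination: keep all paths scoring above a threshold,
    which may be predetermined and fixed or dynamically determined."""
    return [p for p, s in scores.items() if s > threshold]
```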
Accordingly, in one specific example described with reference to the flow diagram of
In the example of
In one embodiment, CPU 1060 after calculating autocorrelation scores for various sampling paths can further process path data for further discrimination of a location of a decodable indicia representation along a path.
In one embodiment, paths selected as yielding the highest autocorrelation scores and indicating an intersection of decodable indicia representations can be subjected to processing for classifying path image data as either background image data or decodable indicia image data. In one example, for performance of such classification, terminal 1000 can binarize image data along each selected sampling path. For binarization, path pixel value image data can be compared to a threshold; all pixel values below the threshold can be classified as decodable indicia, and all pixel values above the threshold can be classified as background. A mean value of a path can be selected as a threshold, or the threshold can be selected according to T=½(Imax+Imin). In another example, Otsu's algorithm can be utilized for determination of the threshold. Segments of a sampling path characterized by a high density of decodable indicia can be classified as decodable indicia segments. Image data at a location determined to include a decodable indicia representation can include image data defining dark portions of a decodable indicia and image data at pixel positions about such indicia.
Referring again to the flow diagram of
In one example, terminal 1000 at block 200 can determine an imaging parameter in the form of an exposure period parameter. In an example of processing block 200, autocorrelation scores which may be calculated at block 100 may be utilized for determining an exposure period parameter. In other embodiments, other imaging parameters, e.g., gain input into amplifier 1036, or an illumination level parameter for input to light source bank 1204 can be controlled instead of or in addition to an exposure period.
Terminal 1000 at block 200 can calculate a white level for a frame selectively utilizing pixel values from sampling areas having autocorrelation scores determined to indicate that a decodable indicia is represented. For example, terminal 1000 can selectively utilize sampling paths yielding the highest autocorrelation score or scores, or autocorrelation scores above a threshold. At block 200, terminal 1000, for purposes of calculating a white level of a frame of image data, can discard sampled pixel values other than pixel values of decodable indicia segments of sampling paths having autocorrelation scores previously determined to indicate that a decodable indicia is represented.
At block 200, terminal 1000 can also discard pixel values of sampling paths outside of decodable indicia segments of such paths. For calculation of a white level for a frame, terminal 1000 can selectively utilize pixel data of selected sampling paths within decodable indicia segments of such paths. In one example, CPU 1060 can average pixel values of selected paths within decodable indicia segments of such paths for calculation of the white level. As has been indicated, terminal 1000 can process a frame of image data for determining a quality of a frame. In one example, a white level of a frame can be utilized in determining a quality of a frame. For example, a frame can be determined to be of suitable quality for a decode attempt if a white level of the frame falls within a predetermined range.
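The selective white-level calculation and the range-based quality test might be sketched as follows; the helper names and the numeric range are illustrative assumptions, not values from the source:

```python
import numpy as np

def frame_white_level(paths, segments_per_path):
    """Calculate a white level selectively: average only pixel values lying
    within decodable indicia segments of the selected sampling paths; all
    other sampled pixel values are discarded."""
    kept = [np.asarray(path, dtype=float)[a:b]
            for path, segs in zip(paths, segments_per_path)
            for a, b in segs]
    return float(np.concatenate(kept).mean())

def suitable_quality(white_level, lo=80.0, hi=200.0):
    """Frame deemed of suitable quality for a decode attempt when its white
    level falls within a predetermined range."""
    return lo <= white_level <= hi
```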
A new exposure period parameter determined at block 200 can be calculated using the following formulas:
ΔE = K(WT − W)   (Equation 3)
EN = EC + ΔE   (Equation 4)
where W is the just-measured white level for the frame currently being processed, calculated selectively utilizing pixel values within decodable indicia segments of sampling paths indicating that a decodable indicia is represented; WT is a target white level for a frame; K is a predetermined constant; ΔE is the change in exposure period; EN is a new exposure parameter to be applied for capture of a subsequent frame; and EC is the exposure period parameter for the frame currently being subject to processing. It should be noted that exposure parameters EC and EN may not be applied for capture of immediately successive frames of image data. As has been indicated in connection with the timing diagram of
Exp2 ≠ Exp3 ≠ … ≠ Expn−1 ≠ Expn ≠ Expn+1
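Equations 3 and 4 can be combined into a single update step. This sketch simply transcribes the formulas; the argument names are assumptions:

```python
def next_exposure(e_current, white_measured, white_target, k):
    """Equations 3 and 4:  dE = K * (WT - W);  EN = EC + dE.

    e_current      EC, exposure period parameter of the current frame
    white_measured W, white level from decodable indicia segments
    white_target   WT, target white level for a frame
    k              K, predetermined constant (loop gain)
    """
    delta_e = k * (white_target - white_measured)   # Equation 3
    return e_current + delta_e                      # Equation 4
```

An under-exposed frame (W below WT) yields a positive ΔE, lengthening the exposure period applied for capture of a subsequent frame, and vice versa.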
Further referring to the flow diagram of
After exposure of image sensor array 1033, image information in the form of voltage signals corresponding to charges stored at the various pixels of image sensor 1032 can be read out of image sensor array 1033 on a row-by-row basis. The image information can be subject to conversion by A/D converter 1037 and routed to RAM 1080 for completion of capture of a frame of image data by storage into RAM 1080. When stored in RAM 1080, image data is addressable for processing by CPU 1060. Referring again to the flow diagram of
Such processing can be carried out within a time period of less than a frame time, even using a CPU having a relatively modest clock speed (in one example, CPU 1060 can be incorporated in an MC9328MXLCVH15 microprocessor IC chip available from Freescale). In one embodiment, terminal 1000 can be adapted so that processing periods 2126, 2128, 2130, 2132, and 2134 are restricted from consuming more than a predetermined time period, e.g., a frame time. In one specific embodiment, with reference to time plot 2122 of
Referring again to the flow diagram of
CPU 1060, appropriately programmed, can carry out a decoding process for attempting to decode a frame of image data. For attempting to decode, CPU 1060 can sample image data of a captured frame of image data along a sampling path, e.g., at a center of a frame, or at a coordinate location determined to include a decodable indicia representation. In one example, a sampling path selected for executing a decode attempt can be a sampling path which for a previous frame was determined to intersect a decodable indicia representation. Next, CPU 1060 can perform a second derivative edge detection to detect edges. After completing edge detection, CPU 1060 can determine data indicating widths between edges. CPU 1060 can then search for start/stop character element sequences and, if found, derive the represented characters one by one by comparing element sequences with a character set table. For certain symbologies, CPU 1060 can also perform a checksum computation. If CPU 1060 successfully determines all characters between a start/stop character sequence and successfully calculates a checksum (if applicable), CPU 1060 can output a decoded message. Where a decodable indicia representation is of a 2D bar code symbology, a decode attempt can comprise the steps of locating a finder pattern using a feature detection algorithm, locating scan lines intersecting the finder pattern according to a predetermined relationship with the finder pattern, determining a pattern of dark and light cells along the scan lines, and converting each pattern of dark and light cells into a character or character string via table lookup.
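A hedged sketch of the second-derivative edge detection and edge-width steps described above for a linear symbology (the zero-crossing test, the gradient threshold, and the function names are illustrative choices, not the specified implementation):

```python
import numpy as np

def edge_positions(path):
    """Detect edges along a 1-D sampling path: an edge is marked where the
    second derivative changes sign while the first derivative is strong."""
    x = np.asarray(path, dtype=float)
    d1 = np.gradient(x)
    d2 = np.gradient(d1)
    strong = 0.25 * np.abs(d1).max()     # gradient floor to reject noise
    edges = []
    for i in range(1, len(d2)):
        if d2[i - 1] * d2[i] < 0 and abs(d1[i]) > strong:
            edges.append(i)
    return edges

def element_widths(edges):
    """Widths (in pixels) between successive edges; a linear decoder compares
    these element widths against a character set table."""
    return [b - a for a, b in zip(edges, edges[1:])]
```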
In one embodiment, terminal 1000 can incorporate a multi-tasking operating system, and CPU 1060 can be programmed to contemporaneously execute, as part of separate processing threads, the described (a) context detection and parameter determination (and possibly quality determination) processing and (b) decoding processing. However, whereas terminal 1000 can be adapted so that context detection and parameter and quality determination processing is restricted from consuming more than one frame time, decoding processing for attempting to extract a decoded message can, in one embodiment, consume more than one frame time. Accordingly, at the time CPU 1060 commences processing of a certain frame, framej, for context detection and parameter and quality determination, CPU 1060 may be processing a previous frame, framej−k, k≧1, for decoding. Referring to the timing diagram of
In one embodiment, terminal 1000 can be adapted to avoid subjecting a frame of image data to a decode attempt unless the frame is determined to be of suitable quality for a decode attempt. Referring again to the timing diagram of
In another embodiment, CPU 1060 can be adapted so that CPU 1060 ceases decoding a frame presently being subject to decoding, and commences decoding a more recently captured frame, conditionally on a result of processing of the more recent frame at blocks 100 and 200. In another embodiment, CPU 1060 can be adapted to subject each newly stored frame to a decode attempt. Also, terminal 1000 can be programmed so that after imaging parameters are developed pursuant to context detection having certain characteristics, a frame acquired using those imaging parameters is automatically subject to decoding without being subject to context detection. Accordingly, it is observed that processing after block 300 can include one or more of (a) subjecting the subsequent frame to context detection processing and/or parameter determination for utilization in capture of a further frame and/or determining frame quality; and (b) subjecting the subsequent frame to a decode attempt.
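The overall control flow described through block 300 can be sketched with hypothetical callables standing in for the context detection (block 100), parameter determination (block 200), quality, and decode stages; all names here are assumptions for illustration:

```python
def process_frames(frames, detect, determine_params, quality_ok, decode):
    """Control-flow sketch: each captured frame is subject to context
    detection (block 100) and parameter determination (block 200); only a
    frame judged to be of suitable quality is subjected to a decode attempt,
    and processing stops once a decoded message is output."""
    imaging_params = {}
    for frame in frames:
        context = detect(frame)                                     # block 100
        imaging_params = determine_params(context, imaging_params)  # block 200
        if quality_ok(frame, context):                              # quality gate
            message = decode(frame)                                 # decode attempt
            if message is not None:
                return message                                      # decoded message output
    return None
```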
While the present invention has been described with reference to a number of specific embodiments, numerous variations are possible. For example, at block 100, CPU 1060, rather than processing sampling path data for context detection of a frame, can apply a texture filter set, e.g., a set such as:
For performance of context detection, texture filter statistics can be examined for determining the existence of and a location of a decodable indicia representation. For example, where a density of vertical and horizontal edge representations exceeds a predetermined threshold, CPU 1060 can determine that a linear bar code symbol is represented in the image data. Furthermore, texture filters need not be applied by CPU 1060 with reference to stored frames of image data. For example, as shown in
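The actual texture filter set is not reproduced above. As a stand-in illustration only, simple first-difference kernels can play the role of a texture filter set for estimating vertical and horizontal edge densities; all names and thresholds here are assumptions:

```python
import numpy as np

def edge_densities(img, thresh=40.0):
    """Fraction of pixels with a strong vertical / horizontal edge response,
    using first-difference kernels as a stand-in texture filter set."""
    f = np.asarray(img, dtype=float)
    vert = np.abs(np.diff(f, axis=1))   # responds to vertical edges (bars)
    horz = np.abs(np.diff(f, axis=0))   # responds to horizontal edges
    return float((vert > thresh).mean()), float((horz > thresh).mean())

def looks_like_linear_symbol(img, density_thresh=0.05):
    """Determine that a linear bar code symbol is represented where the
    vertical-edge density dominates and exceeds a threshold."""
    v, h = edge_densities(img)
    return v > density_thresh and v > h
```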
The method as described herein allows for reading of a decodable indicia in an expanded range of scene conditions. In the example of
By operating according to a described method, terminal 1000 can utilize a determined location of a decodable indicia representation for capture of a frame that can be successfully decoded. In a specific example, by operating according to a described method, a portion of an image representation corresponding to decodable indicia 2804 (e.g., a bar code symbol or OCR characters) can be detected as a decodable indicia representation, pixel values corresponding to indicia 2804 can be utilized for parameter determination for capture of a subsequent frame, and the subsequent frame (or a further subsequent frame after one or more iterations of the method) can be subject to a successful decode attempt for decoding of decodable indicia 2804. By contrast, a commercially available terminal will attempt to set imaging parameters using image data corresponding to energized LEDs 2806, which imaging parameters will not produce image data corresponding to the area of indicia 2804 of sufficient quality for decoding. Further according to the described example, terminal 1000 operating according to a described method can decode decodable indicia 2804 irrespective of a position of indicia 2804 within field of view 40. That is, terminal 1000 operating according to a described method can successfully decode decodable indicia 2804 when decodable indicia 2804 is disposed at an arbitrary position within field of view 40. For example, if substrate 2802 is moved to position 2805, terminal 1000 can utilize pixel values corresponding to the area of position 2805 for determining an imaging parameter, and can subject a subsequent frame captured utilizing the imaging parameter (or a further subsequent frame captured after one or more iterations) to a successful decode attempt.
[End of excerpted sections from referenced U.S. patent application Ser. No. 12/242,244.]
A small sample of systems, methods, and apparatus that are described herein is as follows:
While the present invention has been described with reference to a number of specific embodiments, it will be understood that the true spirit and scope of the invention should be determined only with respect to claims that can be supported by the present specification. Further, while in numerous cases herein systems, apparatuses, and methods are described as having a certain number of elements, it will be understood that such systems, apparatuses, and methods can be practiced with fewer than the mentioned certain number of elements.
| Number | Date | Country |
|---|---|---|
| 20100108769 A1 | May 2010 | US |