Methods for optical character recognition (OCR)

Information

  • Patent Grant
  • Patent Number: 11,475,655
  • Date Filed: Friday, April 10, 2020
  • Date Issued: Tuesday, October 18, 2022
  • Field of Search (CPC):
    • G06K9/6212
    • G06K9/64
    • G06K2209/01
    • G06K9/228
    • G06K9/325
    • G06K9/6292
    • G06K9/2054
    • G06K9/72
  • International Classifications:
    • G06V10/75
    • G06K9/62
    • G06V30/10
  • Term Extension: 244 days
Abstract
A method is provided for Optical Character Recognition (OCR). A plurality of OCR decoding results each having a plurality of positions is obtained from capturing and decoding a plurality of images of the same one or more OCR characters. A recognized character in each OCR decoding result is compared with the recognized character that occupies an identical position in each of the other OCR decoding results. A number of occurrences that each particular recognized character occupies the identical position in the plurality of OCR decoding results is calculated. An individual confidence score is assigned to each particular recognized character based on the number of occurrences, with a highest individual confidence score assigned to a particular recognized character having the greatest number of occurrences. Determining which particular recognized character has been assigned the highest individual confidence score determines which particular recognized character comprises a presumptively valid character for the identical position.
Description
FIELD OF THE INVENTION

The present invention relates to reducing optical character recognition errors, and more particularly, relates to methods for optical character recognition (OCR).


BACKGROUND

Optical character recognition (referred to herein as OCR) is a useful feature that allows a computing device to recognize text (more particularly, characters thereof) in an image and convert the text of the image into machine-operable text, e.g., ASCII characters. The machine-operable text is considered an “OCR decoding result”. “Machine-operable text” includes text characters that can be processed in a computer, usually as bytes. For example, users can download, photograph, or scan books, documents, product labels, etc. to obtain an image including text. The users can perform OCR on the image so as to recognize the text in the image, thereby allowing a user on his/her computer, mobile phone, tablet, etc. to select, copy, search, and edit the text.


Conventional OCR systems, however, frequently produce OCR errors (referred to as “misreads”) when recognizing and decoding text. Common errors include unrecognizable or improperly converted text, e.g., an “O” (letter O) for a “0” (number zero), or an “E” for a “B”. OCR errors can often render converted text unusable until a user corrects the errors. Improperly converted text can occur, for instance, when an image has a low resolution, blurred text, and/or unclear text. In another instance, conventional OCR systems may improperly convert text because the image may include uncommon characters or an underlying adjacent graphic that obscures the text. Furthermore, OCR systems can recognize illustrations in an image as text when the illustration does not actually include text. Generally speaking, Optical Character Recognition (OCR) has conventionally had an unacceptable misread rate.


In barcode scanning, a voting methodology is used to compare the decoded data string to subsequent decodes, and when a sufficient number of identical scans occur, the decode is presumed to be valid and passed on to the host or application. Misreads are rare in barcode scanning, and customers expect the same from OCR scanning. Unfortunately, the voting methodology does not work well in optical character recognition due to the frequency of OCR misreads; it may take a great deal of scanning before a sufficient number of identical scans occur without a misread, if they ever do.
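
To make the contrast concrete, the following is a minimal sketch (not from the patent; the threshold, function names, and strings are illustrative) of whole-string voting: a decode is accepted only once the same full string has been read a required number of times.

```python
from collections import Counter

REQUIRED_IDENTICAL = 3  # illustrative threshold, not from the patent

def vote(decodes, required=REQUIRED_IDENTICAL):
    """Whole-string voting: return the decoded string once it has been read
    `required` times, else None. With frequent OCR misreads, identical
    full-line reads are rare, so this vote may never succeed for OCR."""
    string, count = Counter(decodes).most_common(1)[0]
    return string if count >= required else None

print(vote(["A1B2", "A1B2", "A1BZ", "A1B2"]))  # "A1B2": three identical reads
```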


With an expected decoding result, a standard computer program may be conventionally used to identify a misread OCR decoding result by identifying the misread character(s) therein. For example, a plurality of OCR decoding results is depicted below, with multiple misreads detected by the standard computer program (for example, in the first line, “THEEAZYDOG” should be “THELAZYDOG”):


OCR Line 3: P<UTOTHEEAZYDOG<<QUICK<BROWN<FOX<JUMPS<OVER<MISREAD


OCR Line 3: P<UTOTHELAZYDOG<<QUICK<BROWN<FOX<JUMPS<OVER<MISREAD


OCR Line 3: P<UTOTHELAZYDOG<<QUICK<BROWN<FOX<JUMPS<OVER<OK


OCR Line 3: P<OTOTHELAZYDOG<<QUICK<BROWN<FOX<JUMPS<OVER<MISREAD


OCR Line 3: P<UTOTHELAZYDOG<<QUICK<BROWN<FOX<JUMPS<OVER<MISREAD


OCR Line 3: P<UTOTHELAZYDOG<<QUICK<BROWN<FOX<JUMPS<OVER<MISREAD


OCR Line 3: P<UTOTHBLAZYDOG<<QUICK<BROWN<FOX<JUMPS<OVER<MISREAD


OCR Line 3: P<UTOTHELAZYDOG<<QUICK<BROWN<FOX<JUMPS<OVER<MISREAD


OCR Line 3: P<UTOTHELAZYDOG<<QUICK<BROWN<FOX<JUMPS<OVER<MISREAD


OCR Line 3: P<UTOTHELAZYDOG<<QUICK<BROWN<FOX<JUMPS<OVER<MISREAD


However, in the usual application in the field in which the decoding result is unknown in advance, the standard computer program is not useable for detecting a misread, and therefore for determining when a valid OCR decoding result has been obtained.


Therefore, a need exists for methods for optical character recognition (OCR). Various embodiments provide a presumptively valid OCR decoding result with a high level of confidence even if every OCR decoding result contains a misread.


SUMMARY

Accordingly, in one aspect, the present invention embraces a method for Optical Character Recognition. A plurality of OCR decoding results each having a plurality of positions is obtained from capturing and decoding a plurality of images of the same one or more OCR characters. A recognized character in each OCR decoding result is compared with the recognized character that occupies an identical position in each of the other OCR decoding results. A number of occurrences that each particular recognized character occupies the identical position in the plurality of OCR decoding results is calculated. An individual confidence score is assigned to each particular recognized character based on the number of occurrences, with a highest individual confidence score assigned to a particular recognized character having the greatest number of occurrences. Determining which particular recognized character has been assigned the highest individual confidence score determines which particular recognized character comprises a presumptively valid character for the identical position.


In another aspect, the present invention embraces a method for Optical Character Recognition (OCR). The method comprises obtaining a plurality of OCR decoding results from capturing and decoding a plurality of images of the same one or more characters. Each OCR decoding result comprises a plurality of positions. A first recognized character of each OCR decoding result of the plurality of OCR decoding results is compared with the first recognized character of each of the other OCR decoding results to determine one or more first recognized characters. Each first recognized character of the one or more first recognized characters is assigned an individual confidence score according to its respective number of occurrences as the first recognized character in the plurality of OCR decoding results. Which particular first recognized character comprises a presumptively valid first recognized character is determined by determining the particular first recognized character that has been assigned a highest individual confidence score.


In another aspect, the present invention embraces a method for Optical Character Recognition (OCR). The method comprises obtaining a plurality of OCR decoding results from capturing and decoding a plurality of images of the same one or more characters. Each OCR decoding result comprises a plurality of positions. Each recognized character of the plurality of OCR decoding results is associated with an individual confidence score. A presumptively valid OCR decoding result is identified from its total confidence score based on a combination of the individual confidence scores assigned to each of the recognized characters thereof, the presumptively valid OCR decoding result having a highest total confidence score.


The foregoing illustrative summary, as well as other exemplary objectives and/or advantages of the present invention, and the manner in which the same are accomplished, are further explained within the following detailed description and its accompanying drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a schematic diagram of an exemplary OCR scanner useful for obtaining a plurality of OCR decoding results (such as shown, for example, in the Background and in FIG. 3 below);



FIG. 2 is a flow diagram of a method for optical character recognition (OCR), according to various embodiments of the present invention; and



FIG. 3 graphically depicts a plurality of OCR decoding results, illustrating that methods for determining a presumptively valid OCR decoding result according to various embodiments of the present invention enable such a determination even though every line of text (decoding result) contains a misread.





DETAILED DESCRIPTION

Various embodiments are directed to methods for optical character recognition (OCR). Various embodiments reduce misreads and enable misread recognition so that decoding can be aborted. Various embodiments produce dependable OCR decoding results even if none of the OCR decoding results from capturing and scanning the same line of text matches another, and even if none of the individual OCR decoding results is a valid decoding result. Various embodiments enable or generate a presumptively valid OCR decoding result with a high level of confidence in the decoding result. The confidence level may be arbitrarily set.


As used herein, the term “presumptively valid decoding result” refers to a decoding result that may or may not be valid, but is presumed valid because each recognized character in each position of the decoding result has been assigned the highest individual confidence score or a score greater than a minimum threshold.


Various embodiments of the present invention will be described in relation to a hand-held OCR scanner. However, the present invention may be equally applicable to other types and styles of OCR scanners, e.g., a “page scanner” such as the types used at Customs: a flat window upon which the official places a passport face down; a slot into which the traveler inserts the passport so that the entire image is read; or a reader through which the user physically “swipes” the passport, similar to the way a retailer reads a magnetic stripe. As used herein, the term “OCR scanner” refers to a device that converts images of typed, handwritten, or printed text into machine-encoded text, whether from a scanned document, a photo of a document, an image scan of a product at a point of sale (POS), a scene-photo (for example, the text on signs and billboards in a landscape photo), or from subtitle text superimposed on an image (for example, from a television broadcast), etc.


Referring now to FIG. 1, an exemplary OCR scanner 10 is depicted. The OCR scanner 10 could be a single-purpose device dedicated to OCR scanning, an image-based bar code scanner (i.e., imager) such as an imager-equipped mobile computer with OCR capability, a linear imager or CCD, or a general-purpose mobile computing device (MCD) configured by software to scan text using an integrated camera. The exemplary OCR scanner 10 includes a sensor 11 for sensing one or more characters in a line of text 4. The sensor 11 may use a variety of techniques to sense the one or more characters. The sensor may include a laser scanner for scanning a laser across a field of view 6. A collimated beam of laser light (e.g., 630-680 nanometer wavelength) may be swept back and forth along a scan-line 5 aligned with the line of text. A light detector converts the reflected light into a scanned-OCR signal. The scanned-OCR signal may be an electronic signal with a modulated amplitude corresponding to the pattern of reflected light. This modulated signal may be converted into decoded data by a processor 12 (e.g., one or more of a controller, digital signal processor (DSP), application-specific integrated circuit (ASIC), programmable gate array (PGA), multi-core processor, and/or programmable logic controller (PLC)) communicatively coupled to the sensor 11.


In various embodiments, the sensor 11 includes an image sensor (e.g., CCD, CMOS sensor, etc.) for capturing images of a field of view 6. To scan a line of text, the field of view 6 of the scanner is positioned to view the line of text and an image is captured with the image sensor. The processor 12 communicatively coupled to the sensor 11 converts the image of the line of text into decoded data (a decoding result).


The exemplary OCR scanner 10 also includes a memory 16 (e.g., read-only memory (ROM), flash memory, a hard-drive, etc.) that stores information. The stored information may include a processor-executable software program for decoding the line of text 4. The processor 12 may access the memory 16 to execute the steps of a decoding program for decoding the line of text.


The decoding software program configures the processor 12 to receive the information from the sensor 11 and convert the scanned text into a decoding result. In various embodiments, the OCR scanner includes an input/output (I/O) module. The I/O module 13 (e.g., a user interface) may present the decoded information visually and/or audibly.


The subsystems in the scanner 10 are electrically connected via a coupler (e.g., wires, traces, etc.) to form an interconnection system 15. The interconnection system 15 may include power buses or lines, data buses, instruction buses, address buses, etc., which allow operation of the modules/subsystems and the interaction therebetween.


The scanner 10 is communicatively connected to a computer network 20 via a wired or wireless data link 19 (e.g., IEEE 802.11). A host computer 21 is also communicatively coupled to the computer network 20. This data link 19 may be accessed by a communication module 17 integrated with the scanner 10. In a wireless configuration, the communication module may communicate with a host device over the network via a variety of communication protocols (e.g., WI-FI®, BLUETOOTH®, CDMA, TDMA, or GSM). In some embodiments, the scanner 10 may incorporate a cellular telephone module to communicate over a cellular network as described in U.S. Pat. No. 6,212,401, which is incorporated in its entirety herein by reference.


Referring now to FIGS. 2 and 3, according to various embodiments of the present invention, a method 100 for Optical Character Recognition (OCR) is disclosed. The method 100 for Optical Character Recognition (OCR) comprises obtaining a plurality of OCR decoding results from capturing and decoding a plurality of images of the same one or more OCR characters (step 200). While the OCR decoding results of FIG. 3 use an OCR font, it is to be understood that OCR decoding results may use fonts other than the OCR font shown, e.g., Arial, Times New Roman, or Courier. Obtaining the plurality of OCR decoding results comprises capturing the plurality of images of the same one or more OCR characters, and decoding each image either after the entire plurality of images has been captured or as each image is captured. FIG. 3 depicts ten OCR decoding results, each comprising a plurality of positions.
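
The patent does not prescribe a data structure, but for the sketches that follow it is convenient to model the plurality of decoding results as equal-length strings, one per captured image, where string index i corresponds to position i+1. The strings below echo the Background example and are illustrative only.

```python
# Hypothetical representation: ten decoding results of the same line of text,
# one equal-length string per captured image. String index i corresponds to
# the patent's one-based position i+1. Strings echo the Background example.
decoding_results = [
    "P<UTOTHEEAZYDOG",  # misread: "E" in the ninth position
    "P<UTOTHELAZYDOG",
    "P<UTOTHELAZYDOG",
    "P<OTOTHELAZYDOG",  # misread: "O" in the third position
    "P<UTOTHELAZYDOG",
    "P<UTOTHELAZYDOG",
    "P<UTOTHBLAZYDOG",  # misread: "B" in the eighth position
    "P<UTOTHELAZYDOG",
    "P<UTOTHELAZYDOG",
    "P<UTOTHELAZYDOG",
]
# All results must cover the same plurality of positions.
assert len({len(r) for r in decoding_results}) == 1
```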


Still referring to FIGS. 2 and 3, according to various embodiments of the present invention, method 100 for Optical Character Recognition (OCR) comprises comparing a recognized character in each OCR decoding result of the plurality of OCR decoding results with the recognized character that occupies an identical position in each of the other OCR decoding results (step 210). For example, a first recognized character of each OCR decoding result of the plurality of OCR decoding results may be compared with the first recognized character of each of the other OCR decoding results to determine one or more first recognized characters. Each succeeding recognized character of each OCR decoding result can be compared with a recognized character occupying the identical position in each of the other OCR decoding results. In the depicted embodiment of FIG. 3, in which there are an exemplary forty-four positions, in the first decoding result the letter “P” occupies the first position, the character “<” occupies the second position, the letter “U” occupies the third position, the letter “T” occupies the fourth position, the letter “O” occupies the fifth position, the letter “T” occupies the sixth position, the letter “H” occupies the seventh position, the letter “E” occupies the eighth position, the letter “E” (incorrectly) occupies the ninth position, and so on.


Still referring to FIG. 3, in accordance with various embodiments of the present invention, the letter “P” in the first position in the first decoding result is compared with the recognized character that occupies the identical position (the first position, in this case) in each of the other OCR decoding results. The character “<” in the second position in the first decoding result is compared with the recognized character that occupies the second position in each of the other OCR decoding results. The letter “U” in the third position in the first decoding result is compared with the recognized character that occupies the third position in each of the other OCR decoding results. The letter “T” in the fourth position in the first decoding result is compared with the recognized character that occupies the fourth position in each of the other OCR decoding results. As there are 44 exemplary positions in each of the depicted OCR decoding results of FIG. 3, the comparison step is repeated 44 times in the depicted embodiment to determine the presumptively valid character for each position as hereinafter described.


Still referring to FIGS. 2 and 3, according to various embodiments of the present invention, method 100 for Optical Character Recognition (OCR) comprises calculating a number of occurrences that each particular recognized character occupies the identical position in the plurality of OCR decoding results (step 220). For example, the letter “P” appears as the recognized character in the first position of the line of text in all ten decoding results (i.e., the letter “P” has ten occurrences and there is no other recognized character in the first position). Therefore, the letter “P” has the highest individual confidence score for the first position and is therefore the presumptively valid character for the first position. The character “<” appears as the recognized character in the second position of the line of text in all ten decoding results (i.e., the character “<” has ten occurrences and there is no other recognized character in the second position). Therefore, the character “<” has the highest individual confidence score for the second position and is therefore the presumptively valid character for the second position. Looking now at the third position in each of the ten exemplary decoding results, there are nine occurrences of the letter “U” and one occurrence of the letter “O”. The letter “U”, and not the letter “O”, is the presumptively valid character for the third position because it has the greater number of occurrences in that position. There are ten occurrences of the letter “T” in the fourth position, ten occurrences of the letter “O” in the fifth position, ten occurrences of the letter “T” in the sixth position, and ten occurrences of the letter “H” in the seventh position. These recognized characters are therefore presumptively valid characters for their respective positions. However, in the eighth position, the letter “E” was misread as “B” in the seventh decoding result, and in the ninth position, the letter “L” was misread as the letter “E” in the first OCR decoding result. For the eighth position, the letter “E” is the presumptively valid character with nine occurrences versus one occurrence for the letter “B”. For the ninth position, the letter “L” is the presumptively valid character with nine occurrences, whereas the letter “E” has only one occurrence. As noted previously, each succeeding recognized character of each OCR decoding result can be compared with a recognized character occupying the identical position in each of the other OCR decoding results. The next discrepancy between recognized characters in a respective position is in the letter “O” in DOG (i.e., the 14th position), with the fourth decoding result from the top misreading the letter “O” as the number “0”. As the letter “O” occupies the 14th position in nine out of ten decoding results (nine occurrences) versus one occurrence of the number “0”, the letter “O” is the presumptively valid character for the 14th position. In the word “BROWN”, the letter “B” occupies the 24th position in nine decoding results, with the letter “E” occupying the 24th position for one occurrence. Therefore, for the 24th position, “B” is the presumptively valid character. The letter “O” in the word “BROWN” occupies the 26th position in seven out of ten decoding results, and the number “0” has only three occurrences in that position. Therefore, the letter “O” is the presumptively valid character for the 26th position. In the word “FOX”, the letter “O” has eight occurrences in the 31st position, whereas the number “0” has two occurrences. Therefore, for the 31st position, the letter “O” is the presumptively valid character. In the word “JUMPS”, the letter “P” has nine occurrences, whereas the letter “F” (third OCR decoding result of FIG. 3) has only one occurrence. Therefore, the letter “P” is the presumptively valid character for the 37th position as it has more occurrences than the letter “F” for that position. In the word “OVER”, the letter “O” in the 40th position has eight occurrences, whereas the number “0” has only two occurrences, making the letter “O” the presumptively valid character for that position. In this manner, the number of occurrences that a recognized character occupies a particular position in the plurality of OCR decoding results is calculated.
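
A minimal sketch of the counting in step 220, under the string representation assumed earlier: transposing the decoding results yields the characters occupying each position, and a tally per position gives the occurrences.

```python
from collections import Counter

decoding_results = [
    "P<UTOTHEEAZYDOG", "P<UTOTHELAZYDOG", "P<UTOTHELAZYDOG", "P<OTOTHELAZYDOG",
    "P<UTOTHELAZYDOG", "P<UTOTHELAZYDOG", "P<UTOTHBLAZYDOG", "P<UTOTHELAZYDOG",
    "P<UTOTHELAZYDOG", "P<UTOTHELAZYDOG",
]  # illustrative results, redefined so this snippet runs standalone

def occurrences_by_position(results):
    """Step 220: for every position, tally how many decoding results
    recognized each particular character at that position."""
    return [Counter(chars) for chars in zip(*results)]  # zip(*...) transposes

counts = occurrences_by_position(decoding_results)
print(counts[2])  # third position:  Counter({'U': 9, 'O': 1})
print(counts[7])  # eighth position: Counter({'E': 9, 'B': 1})
```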


Stated simply, if the same recognized character occupies the same position in all decoding results, the recognized character is the presumptively valid character for that position (e.g., if the letter “A” occupies the same position in all decoding results, the letter “A” is the presumptively valid character for that position). On the other hand, if there is more than one recognized character occupying the same position in the plurality of decoding results, the presumptively valid character is the recognized character that occupies the same position in a greater number of decoding results (the recognized character with the greater number of occurrences) as hereinafter described.


The recognized character from capturing and decoding each image of the plurality of images may be transmitted to a memory space and each occurrence of the recognized character aggregated in the memory space for calculating the number of occurrences as previously described.


A lexicon for the plurality of OCR decoding results may be predetermined. If the recognized character is not included in the predetermined lexicon, it can be determined whether to include or exclude the recognized character in the calculation of the number of occurrences. For example, the number “0” is often misread as the letter “O”, and vice versa. If the number “0” is included in the predetermined lexicon (i.e., it can legitimately occur, as in a numeric-only field such as a passport number), it may be included in the number of occurrences for the letter “O” so as not to over-exclude. If the number “0” is excluded from the predetermined lexicon, the number “0” cannot be a presumptively valid character. Relatedly, different characters can have different levels of confidence depending on their propensity for misreading. Sometimes there are known attributes in a given field that can be applied. For example, for the decoding result “BROWN”, the field is known to be non-numeric, so misreads into a digit are not counted.
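
A sketch of one way such lexicon handling might look; the FOLD_INTO table and the folding rule are assumptions for illustration, not the patent's prescription.

```python
from collections import Counter

# Assumed confusable pairs; the patent names only "0" vs. "O".
FOLD_INTO = {"0": "O"}

def count_with_lexicon(chars_at_position, lexicon):
    """Tally occurrences at one position, honoring a predetermined lexicon:
    an out-of-lexicon character is either folded into the count of its
    confusable in-lexicon counterpart or excluded from the tally."""
    counts = Counter()
    for ch in chars_at_position:
        if ch in lexicon:
            counts[ch] += 1
        elif FOLD_INTO.get(ch) in lexicon:
            counts[FOLD_INTO[ch]] += 1  # e.g., count a "0" misread toward "O"
        # else: exclude the character from the number of occurrences
    return counts

alpha_lexicon = set("ABCDEFGHIJKLMNOPQRSTUVWXYZ<")  # non-numeric field
print(count_with_lexicon(list("OO0O"), alpha_lexicon))  # Counter({'O': 4})
```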


In various embodiments of the present invention, the method 100 for Optical Character Recognition (OCR) may further comprise verifying that the number of OCR decoding results in which the recognized character occupies the particular position comprises a minimum threshold number of OCR decoding results (step 230). If the calculated number of OCR decoding results in which the recognized character occupies the particular position does not comprise the minimum threshold number of OCR decoding results, additional OCR decoding results may be obtained by capturing and decoding additional images of the same one or more characters (step 240). The comparing, calculating, and assigning steps may then be repeated with the original OCR decoding results and the additional OCR decoding results until the calculated number of OCR decoding results in which the recognized character occupies the particular position comprises the minimum threshold number of OCR decoding results out of the total number of OCR decoding results (a minimum threshold percentage). For example, if a particular recognized character occupies a particular position in only half the decoding results, there may not be confidence that the particular recognized character is the presumptively valid character for that position. The minimum threshold number or percentage can be set to any threshold desired, for example, 7 in 10 or better.
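
A sketch of the verify-and-rescan loop of steps 230 and 240, assuming a capture_and_decode callable supplied by the scanner and the 7-in-10 threshold mentioned above; all names and the simulated scanner are illustrative.

```python
from collections import Counter
import itertools

MIN_OCCURRENCES = 7   # illustrative: require 7 in 10 or better
TOTAL_RESULTS = 10

def scan_until_confident(capture_and_decode, position):
    """Steps 230/240: keep capturing and decoding images until the leading
    character for `position` meets the minimum threshold percentage."""
    results = []
    while True:
        results.append(capture_and_decode())  # step 240: one more decode
        if len(results) < TOTAL_RESULTS:
            continue
        counts = Counter(r[position] for r in results)
        char, occurrences = counts.most_common(1)[0]
        if occurrences / len(results) >= MIN_OCCURRENCES / TOTAL_RESULTS:
            return char, len(results)  # step 230 satisfied

# Simulated scanner that misreads the third position one time in three.
simulated = itertools.cycle(["P<U", "P<U", "P<O"]).__next__
print(scan_until_confident(simulated, position=2))  # ('U', 10)
```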


If there is significant background noise or an out of focus situation, it may be possible that no one recognized character emerges as the recognized character with the highest number of occurrences and therefore no one recognized character can be assigned the highest individual confidence score. It may be necessary to obtain additional OCR decoding results (hopefully after eliminating at least some of the background noise or improving the focus). In this situation, the imager can be configured to capture and buffer images continually or the scanner can offer auditory or visual cues to the user to continue scanning.


Still referring to FIGS. 2 and 3, according to various embodiments of the present invention, the method 100 for Optical Character Recognition (OCR) comprises assigning an individual confidence score to each particular recognized character based on its respective number of occurrences, with a highest individual confidence score assigned to a particular recognized character having the greatest number of occurrences for that position (step 250); i.e., the highest individual confidence score is assigned to the particular recognized character that occupies a particular position within the greatest number of OCR decoding results, whether those are original OCR decoding results or original and additional OCR decoding results. The comparison, calculation, and assignment steps are completed over all recognized characters included in each position of each OCR decoding result of the plurality of OCR decoding results so that each particular recognized character has an individual confidence score.
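
One plausible scoring rule for step 250 (assumed here, since the patent leaves the scoring function open) is to normalize occurrences by the total number of decoding results, so the character with the greatest number of occurrences automatically receives the highest individual confidence score:

```python
from collections import Counter

def individual_confidence_scores(decoding_results, position):
    """Step 250 under an assumed scoring rule: score = occurrences / total,
    so the greatest number of occurrences yields the highest score."""
    counts = Counter(r[position] for r in decoding_results)
    total = len(decoding_results)
    return {char: occ / total for char, occ in counts.items()}

decoding_results = ["P<UTOTHEEAZYDOG", "P<UTOTHELAZYDOG", "P<UTOTHELAZYDOG"]
# Ninth position (index 8): {'E': 0.33..., 'L': 0.66...}
print(individual_confidence_scores(decoding_results, position=8))
```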


Still referring to FIGS. 2 and 3, according to various embodiments of the present invention, the method 100 for Optical Character Recognition (OCR) comprises determining which particular recognized character comprises a presumptively valid character for each position by determining which particular recognized character for that position has been assigned the highest individual confidence score (step 260). As noted previously, if only one recognized character occupies the same position in each decoding result, that one particular recognized character has a 100% confidence score that it is the presumptively valid character for that particular position. While only two alternative recognized characters are described for some of the positions of FIG. 3, it is to be understood that there may be multiple recognized characters in each position, each occupying the position for a number of occurrences. For example, the 8th position could have had 7 E's, 2 B's, and an F. The recognized character occupying the same position for the greatest number of occurrences is the presumptively valid character for that position. If there is a tie in the number of occurrences, it may be necessary to obtain additional decoding results until a particular recognized character leads by a selected margin as the recognized character in that particular position with the highest confidence score.
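
A sketch of step 260 with the tie handling just described; the margin parameter is an assumption standing in for the "selected margin".

```python
from collections import Counter

def presumptively_valid_character(decoding_results, position, margin=1):
    """Step 260: return the character with the highest individual confidence
    score for a position; on a tie, or a lead smaller than the selected
    margin, return None to signal that more decoding results are needed."""
    ranked = Counter(r[position] for r in decoding_results).most_common()
    if len(ranked) > 1 and ranked[0][1] - ranked[1][1] < margin:
        return None  # no clear leader yet
    return ranked[0][0]

print(presumptively_valid_character(["AB", "AB", "AC"], position=1))  # 'B'
print(presumptively_valid_character(["AB", "AC"], position=1))        # None (tie)
```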


The method 100 for Optical Character Recognition (OCR) comprises repeating the comparing, calculating, assigning, and determining steps for each position in the decoding result to determine the presumptively valid character for each position. In various embodiments of the present invention, the comparing, calculating, assigning, and determining steps may be completed for each position prior to moving onto the next position where comparison, calculation, assignment, and determination steps are repeated for determining the presumptively valid character for the next position, and so on for each position. However, it is not necessary that all steps be completed in sequence prior to moving onto a next position. It may be possible to return to a position for determining the presumptively valid decoding result.


Still referring to FIGS. 2 and 3, according to various embodiments of the present invention, the method 100 for Optical Character Recognition (OCR) further comprises associating each OCR decoding result with a total confidence score based on a combination of the individual confidence scores assigned to each particular recognized character therein (step 270), identifying the OCR decoding result(s) that is/are associated with the highest total confidence score (step 280), and selecting the OCR decoding result(s) that is/are associated with the highest total confidence score as the presumptively valid OCR decoding result (step 290). There may be more than one presumptively valid OCR decoding result in the plurality of OCR decoding results obtained in step 200, as long as the presumptively valid OCR decoding results are identical.
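
A sketch of steps 270 through 290, assuming (as one possible combination rule) that a decoding result's total confidence score is the sum of its characters' individual confidence scores:

```python
from collections import Counter

def best_decoding_result(decoding_results):
    """Steps 270-290: combine (here, by summing; the combination rule is an
    assumption) the individual confidence scores of a result's characters
    into a total confidence score, then select the highest-scoring result."""
    total = len(decoding_results)
    scores_by_pos = [Counter(chars) for chars in zip(*decoding_results)]

    def total_score(result):
        return sum(scores_by_pos[i][ch] / total for i, ch in enumerate(result))

    return max(decoding_results, key=total_score)

decoding_results = ["P<UTOTHEEAZYDOG", "P<UTOTHELAZYDOG", "P<UTOTHELAZYDOG"]
print(best_decoding_result(decoding_results))  # "P<UTOTHELAZYDOG"
```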


The OCR decoding result may be selected as the presumptively valid decoding result even when the selected OCR decoding result does not match another OCR decoding result of the plurality of OCR decoding results, or when none of the plurality of decoding results is correctly decoded in its entirety, because the presumptively valid decoding result is the decoding result with the highest total confidence score.


In various embodiments of the present invention, it is to be understood that the presumptively valid decoding result may alternatively be generated from an ordered combination of the presumptively valid characters (the characters with the highest individual confidence scores), in which case the presumptively valid decoding result may be absent from the plurality of decoding results obtained in step 200 (step 300).
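
A sketch of step 300: the presumptively valid decoding result is synthesized as the ordered combination of each position's presumptively valid character, so it can be correct even though every captured decoding result contains a misread.

```python
from collections import Counter

def synthesized_result(decoding_results):
    """Step 300: the ordered combination of each position's highest-scoring
    character; the output need not match any single captured result."""
    return "".join(Counter(chars).most_common(1)[0][0]
                   for chars in zip(*decoding_results))

# Every captured result below contains a misread, yet the synthesized
# decoding result is still correct (illustrative strings).
decoding_results = ["P<UTOTHEEAZYDOG", "P<OTOTHELAZYDOG", "P<UTOTHBLAZYDOG"]
print(synthesized_result(decoding_results))  # "P<UTOTHELAZYDOG"
```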



FIG. 3 demonstrates the improvement over the voting methodology used in barcode scanning. As noted previously, voting compares the data string to subsequent decodes and when a sufficient number of identical scans occur, the decode is passed. In accordance with various embodiments of the present invention, every recognized character in the decoding results is analyzed statistically and the decoding result may be a presumptively valid decoding result even if none of the scans matches a subsequent scan.


From the foregoing, it is to be understood that various embodiments determine a presumptively valid optical character recognition (OCR) decoding result. Various embodiments reduce misreads and enable misread recognition so that decoding can be aborted. Various embodiments help determine presumptively valid OCR decoding results, even if none of the individual OCR decoding results obtained is a valid decoding result. Various embodiments enable or generate a presumptively valid OCR decoding result with a high level of confidence in the decoding result.


To supplement the present disclosure, this application incorporates entirely by reference the following commonly assigned patents, patent application publications, and patent applications:


U.S. Pat. Nos. 6,832,725; 7,128,266; 7,159,783; 7,413,127; 7,726,575; 8,294,969; 8,317,105; 8,322,622; 8,366,005; 8,371,507; 8,376,233; 8,381,979; 8,390,909; 8,408,464; 8,408,468; 8,408,469; 8,424,768; 8,448,863; 8,457,013; 8,459,557; 8,469,272; 8,474,712; 8,479,992; 8,490,877; 8,517,271; 8,523,076; 8,528,818; 8,544,737; 8,548,242; 8,548,420; 8,550,335; 8,550,354; 8,550,357; 8,556,174; 8,556,176; 8,556,177; 8,559,767; 8,599,957; 8,561,895; 8,561,903; 8,561,905; 8,565,107; 8,571,307; 8,579,200; 8,583,924; 8,584,945; 8,587,595; 8,587,697; 8,588,869; 8,590,789; 8,596,539; 8,596,542; 8,596,543; 8,599,271; 8,599,957; 8,600,158; 8,600,167; 8,602,309; 8,608,053; 8,608,071; 8,611,309; 8,615,487; 8,616,454; 8,621,123; 8,622,303; 8,628,013; 8,628,015; 8,628,016; 8,629,926; 8,630,491; 8,635,309; 8,636,200; 8,636,212; 8,636,215; 8,636,224; 8,638,806; 8,640,958; 8,640,960; 8,643,717; 8,646,692; 8,646,694; 8,657,200; 8,659,397; 8,668,149; 8,678,285; 8,678,286; 8,682,077; 8,687,282; 8,692,927; 8,695,880; 8,698,949; 8,717,494; 8,717,494; 8,720,783; 8,723,804; 8,723,904; 8,727,223; 8,740,082; 8,740,085; 8,746,563; 8,750,445; 8,752,766; 8,756,059; 8,757,495; 8,760,563; 8,763,909; 8,777,108; 8,777,109; 8,779,898; 8,781,520; 8,783,573; 8,789,757; 8,789,758; 8,789,759; 8,794,520; 8,794,522; 8,794,525; 8,794,526; 8,798,367; 8,807,431; 8,807,432; 8,820,630; 8,822,848; 8,824,692; 8,824,696; 8,842,849; 8,844,822; 8,844,823; 8,849,019; 8,851,383; 8,854,633; 8,866,963; 8,868,421; 8,868,519; 8,868,802; 8,868,803; 8,870,074; 8,879,639; 8,880,426; 8,881,983; 8,881,987; 8,903,172; 8,908,995; 8,910,870; 8,910,875; 8,914,290; 8,914,788; 8,915,439; 8,915,444; 8,916,789; 8,918,250; 8,918,564; 8,925,818; 8,939,374; 8,942,480; 8,944,313; 8,944,327; 8,944,332; 8,950,678; 8,967,468; 8,971,346; 8,976,030; 8,976,368; 8,978,981; 8,978,983; 8,978,984; 8,985,456; 8,985,457; 8,985,459; 8,985,461; 8,988,578; 8,988,590; 8,991,704; 8,996,194; 8,996,384; 9,002,641; 9,007,368; 9,010,641; 9,015,513; 9,016,576; 9,022,288; 9,030,964; 9,033,240; 9,033,242; 9,036,054; 9,037,344; 9,038,911; 9,038,915; 9,047,098; 9,047,359; 9,047,420; 9,047,525; 9,047,531; 9,053,055; 9,053,378; 9,053,380; 9,058,526; 9,064,165; 9,064,165; 9,064,167; 9,064,168; 9,064,254; 9,066,032; 9,070,032; 9,076,459; 9,079,423; 9,080,856; 9,082,023; 9,082,031; 9,084,032; 9,087,250; 9,092,681; 9,092,682; 9,092,683; 9,093,141; 9,098,763; 9,104,929; 9,104,934; 9,107,484; 9,111,159; 9,111,166; 9,135,483; 9,137,009; 9,141,839; 9,147,096; 9,148,474; 9,158,000; 9,158,340; 9,158,953; 9,159,059; 9,165,174; 9,171,543; 9,183,425; 9,189,669; 9,195,844; 9,202,458; 9,208,366; 9,208,367; 9,219,836; 9,224,024; 9,224,027; 9,230,140; 9,235,553; 9,239,950; 9,245,492; 9,248,640; 9,250,652; 9,250,712; 9,251,411; 9,258,033; 9,262,633; 9,262,660; 9,262,662; 9,269,036; 9,270,782; 9,274,812; 9,275,388; 9,277,668; 9,280,693; 9,286,496; 9,298,964; 9,301,427; 9,313,377; 9,317,037; 9,319,548; 9,342,723; 9,361,882; 9,365,381; 9,373,018; 9,375,945; 9,378,403; 9,383,848; 9,384,374; 9,390,304; 9,390,596; 9,411,386; 9,412,242; 9,418,269; 9,418,270; 9,465,967; 9,423,318; 9,424,454; 9,436,860; 9,443,123; 9,443,222; 9,454,689; 9,464,885; 9,465,967; 9,478,983; 9,481,186; 9,487,113; 9,488,986; 9,489,782; 9,490,540; 9,491,729; 9,497,092; 9,507,974; 9,519,814; 9,521,331; 9,530,038; 9,572,901; 9,558,386; 9,606,581; 9,646,189; 9,646,191; 9,652,648; 9,652,653; 9,656,487; 9,659,198; 9,680,282; 9,697,401; 9,701,140; U.S. Design Pat. No. D702,237; U.S. Design Pat. No. D716,285; U.S. Design Pat. No. D723,560; U.S. 
Design Pat. No. D730,357; U.S. Design Pat. No. D730,901; U.S. Design Pat. No. D730,902; U.S. Design Pat. No. D734,339; U.S. Design Pat. No. D737,321; U.S. Design Pat. No. D754,205; U.S. Design Pat. No. D754,206; U.S. Design Pat. No. D757,009; U.S. Design Pat. No. D760,719; U.S. Design Pat. No. D762,604; U.S. Design Pat. No. D766,244; U.S. Design Pat. No. D777,166; U.S. Design Pat. No. D771,631; U.S. Design Pat. No. D783,601; U.S. Design Pat. No. D785,617; U.S. Design Pat. No. D785,636; U.S. Design Pat. No. D790,505; U.S. Design Pat. No. D790,546; International Publication No. 2013/163789; U.S. Patent Application Publication No. 2008/0185432; U.S. Patent Application Publication No. 2009/0134221; U.S. Patent Application Publication No. 2010/0177080; U.S. Patent Application Publication No. 2010/0177076; U.S. Patent Application Publication No. 2010/0177707; U.S. Patent Application Publication No. 2010/0177749; U.S. Patent Application Publication No. 2010/0265880; U.S. Patent Application Publication No. 2011/0202554; U.S. Patent Application Publication No. 2012/0111946; U.S. Patent Application Publication No. 2012/0168511; U.S. Patent Application Publication No. 2012/0168512; U.S. Patent Application Publication No. 2012/0193423; U.S. Patent Application Publication No. 2012/0194692; U.S. Patent Application Publication No. 2012/0203647; U.S. Patent Application Publication No. 2012/0223141; U.S. Patent Application Publication No. 2012/0228382; U.S. Patent Application Publication No. 2012/0248188; U.S. Patent Application Publication No. 2013/0043312; U.S. Patent Application Publication No. 2013/0082104; U.S. Patent Application Publication No. 2013/0175341; U.S. Patent Application Publication No. 2013/0175343; U.S. Patent Application Publication No. 2013/0257744; U.S. Patent Application Publication No. 2013/0257759; U.S. Patent Application Publication No. 2013/0270346; U.S. Patent Application Publication No. 2013/0292475; U.S. Patent Application Publication No. 2013/0292477; U.S. Patent Application Publication No. 2013/0293539; U.S. Patent Application Publication No. 2013/0293540; U.S. Patent Application Publication No. 2013/0306728; U.S. Patent Application Publication No. 2013/0306731; U.S. Patent Application Publication No. 2013/0307964; U.S. Patent Application Publication No. 2013/0308625; U.S. Patent Application Publication No. 2013/0313324; U.S. Patent Application Publication No. 2013/0332996; U.S. Patent Application Publication No. 2014/0001267; U.S. Patent Application Publication No. 2014/0025584; U.S. Patent Application Publication No. 2014/0034734; U.S. Patent Application Publication No. 2014/0036848; U.S. Patent Application Publication No. 2014/0039693; U.S. Patent Application Publication No. 2014/0049120; U.S. Patent Application Publication No. 2014/0049635; U.S. Patent Application Publication No. 2014/0061306; U.S. Patent Application Publication No. 2014/0063289; U.S. Patent Application Publication No. 2014/0066136; U.S. Patent Application Publication No. 2014/0067692; U.S. Patent Application Publication No. 2014/0070005; U.S. Patent Application Publication No. 2014/0071840; U.S. Patent Application Publication No. 2014/0074746; U.S. Patent Application Publication No. 2014/0076974; U.S. Patent Application Publication No. 2014/0097249; U.S. Patent Application Publication No. 2014/0098792; U.S. Patent Application Publication No. 2014/0100813; U.S. Patent Application Publication No. 2014/0103115; U.S. Patent Application Publication No. 2014/0104413; U.S. Patent Application Publication No. 
2014/0104414; U.S. Patent Application Publication No. 2014/0104416; U.S. Patent Application Publication No. 2014/0106725; U.S. Patent Application Publication No. 2014/0108010; U.S. Patent Application Publication No. 2014/0108402; U.S. Patent Application Publication No. 2014/0110485; U.S. Patent Application Publication No. 2014/0125853; U.S. Patent Application Publication No. 2014/0125999; U.S. Patent Application Publication No. 2014/0129378; U.S. Patent Application Publication No. 2014/0131443; U.S. Patent Application Publication No. 2014/0133379; U.S. Patent Application Publication No. 2014/0136208; U.S. Patent Application Publication No. 2014/0140585; U.S. Patent Application Publication No. 2014/0152882; U.S. Patent Application Publication No. 2014/0158770; U.S. Patent Application Publication No. 2014/0159869; U.S. Patent Application Publication No. 2014/0166759; U.S. Patent Application Publication No. 2014/0168787; U.S. Patent Application Publication No. 2014/0175165; U.S. Patent Application Publication No. 2014/0191684; U.S. Patent Application Publication No. 2014/0191913; U.S. Patent Application Publication No. 2014/0197304; U.S. Patent Application Publication No. 2014/0214631; U.S. Patent Application Publication No. 2014/0217166; U.S. Patent Application Publication No. 2014/0231500; U.S. Patent Application Publication No. 2014/0247315; U.S. Patent Application Publication No. 2014/0263493; U.S. Patent Application Publication No. 2014/0263645; U.S. Patent Application Publication No. 2014/0270196; U.S. Patent Application Publication No. 2014/0270229; U.S. Patent Application Publication No. 2014/0278387; U.S. Patent Application Publication No. 2014/0288933; U.S. Patent Application Publication No. 2014/0297058; U.S. Patent Application Publication No. 2014/0299665; U.S. Patent Application Publication No. 2014/0332590; U.S. Patent Application Publication No. 2014/0351317; U.S. Patent Application Publication No. 2014/0362184; U.S. Patent Application Publication No. 2014/0363015; U.S. Patent Application Publication No. 2014/0369511; U.S. Patent Application Publication No. 2014/0374483; U.S. Patent Application Publication No. 2014/0374485; U.S. Patent Application Publication No. 2015/0001301; U.S. Patent Application Publication No. 2015/0001304; U.S. Patent Application Publication No. 2015/0009338; U.S. Patent Application Publication No. 2015/0014416; U.S. Patent Application Publication No. 2015/0021397; U.S. Patent Application Publication No. 2015/0028104; U.S. Patent Application Publication No. 2015/0029002; U.S. Patent Application Publication No. 2015/0032709; U.S. Patent Application Publication No. 2015/0039309; U.S. Patent Application Publication No. 2015/0039878; U.S. Patent Application Publication No. 2015/0040378; U.S. Patent Application Publication No. 2015/0049347; U.S. Patent Application Publication No. 2015/0051992; U.S. Patent Application Publication No. 2015/0053769; U.S. Patent Application Publication No. 2015/0062366; U.S. Patent Application Publication No. 2015/0063215; U.S. Patent Application Publication No. 2015/0088522; U.S. Patent Application Publication No. 2015/0096872; U.S. Patent Application Publication No. 2015/0100196; U.S. Patent Application Publication No. 2015/0102109; U.S. Patent Application Publication No. 2015/0115035; U.S. Patent Application Publication No. 2015/0127791; U.S. Patent Application Publication No. 2015/0128116; U.S. Patent Application Publication No. 2015/0133047; U.S. Patent Application Publication No. 2015/0134470; U.S. 
Patent Application Publication No. 2015/0136851; U.S. Patent Application Publication No. 2015/0142492; U.S. Patent Application Publication No. 2015/0144692; U.S. Patent Application Publication No. 2015/0144698; U.S. Patent Application Publication No. 2015/0149946; U.S. Patent Application Publication No. 2015/0161429; U.S. Patent Application Publication No. 2015/0178523; U.S. Patent Application Publication No. 2015/0178537; U.S. Patent Application Publication No. 2015/0178685; U.S. Patent Application Publication No. 2015/0181109; U.S. Patent Application Publication No. 2015/0199957; U.S. Patent Application Publication No. 2015/0210199; U.S. Patent Application Publication No. 2015/0212565; U.S. Patent Application Publication No. 2015/0213647; U.S. Patent Application Publication No. 2015/0220753; U.S. Patent Application Publication No. 2015/0220901; U.S. Patent Application Publication No. 2015/0227189; U.S. Patent Application Publication No. 2015/0236984; U.S. Patent Application Publication No. 2015/0239348; U.S. Patent Application Publication No. 2015/0242658; U.S. Patent Application Publication No. 2015/0248572; U.S. Patent Application Publication No. 2015/0254485; U.S. Patent Application Publication No. 2015/0261643; U.S. Patent Application Publication No. 2015/0264624; U.S. Patent Application Publication No. 2015/0268971; U.S. Patent Application Publication No. 2015/0269402; U.S. Patent Application Publication No. 2015/0288689; U.S. Patent Application Publication No. 2015/0288896; U.S. Patent Application Publication No. 2015/0310243; U.S. Patent Application Publication No. 2015/0310244; U.S. Patent Application Publication No. 2015/0310389; U.S. Patent Application Publication No. 2015/0312780; U.S. Patent Application Publication No. 2015/0327012; U.S. Patent Application Publication No. 2016/0014251; U.S. Patent Application Publication No. 2016/0025697; U.S. Patent Application Publication No. 2016/0026838; U.S. Patent Application Publication No. 2016/0026839; U.S. Patent Application Publication No. 2016/0040982; U.S. Patent Application Publication No. 2016/0042241; U.S. Patent Application Publication No. 2016/0057230; U.S. Patent Application Publication No. 2016/0062473; U.S. Patent Application Publication No. 2016/0070944; U.S. Patent Application Publication No. 2016/0092805; U.S. Patent Application Publication No. 2016/0101936; U.S. Patent Application Publication No. 2016/0104019; U.S. Patent Application Publication No. 2016/0104274; U.S. Patent Application Publication No. 2016/0109219; U.S. Patent Application Publication No. 2016/0109220; U.S. Patent Application Publication No. 2016/0109224; U.S. Patent Application Publication No. 2016/0112631; U.S. Patent Application Publication No. 2016/0112643; U.S. Patent Application Publication No. 2016/0117627; U.S. Patent Application Publication No. 2016/0124516; U.S. Patent Application Publication No. 2016/0125217; U.S. Patent Application Publication No. 2016/0125342; U.S. Patent Application Publication No. 2016/0125873; U.S. Patent Application Publication No. 2016/0133253; U.S. Patent Application Publication No. 2016/0171597; U.S. Patent Application Publication No. 2016/0171666; U.S. Patent Application Publication No. 2016/0171720; U.S. Patent Application Publication No. 2016/0171775; U.S. Patent Application Publication No. 2016/0171777; U.S. Patent Application Publication No. 2016/0174674; U.S. Patent Application Publication No. 2016/0178479; U.S. Patent Application Publication No. 2016/0178685; U.S. Patent Application Publication No. 
2016/0178707; U.S. Patent Application Publication No. 2016/0179132; U.S. Patent Application Publication No. 2016/0179143; U.S. Patent Application Publication No. 2016/0179368; U.S. Patent Application Publication No. 2016/0179378; U.S. Patent Application Publication No. 2016/0180130; U.S. Patent Application Publication No. 2016/0180133; U.S. Patent Application Publication No. 2016/0180136; U.S. Patent Application Publication No. 2016/0180594; U.S. Patent Application Publication No. 2016/0180663; U.S. Patent Application Publication No. 2016/0180678; U.S. Patent Application Publication No. 2016/0180713; U.S. Patent Application Publication No. 2016/0185136; U.S. Patent Application Publication No. 2016/0185291; U.S. Patent Application Publication No. 2016/0186926; U.S. Patent Application Publication No. 2016/0188861; U.S. Patent Application Publication No. 2016/0188939; U.S. Patent Application Publication No. 2016/0188940; U.S. Patent Application Publication No. 2016/0188941; U.S. Patent Application Publication No. 2016/0188942; U.S. Patent Application Publication No. 2016/0188943; U.S. Patent Application Publication No. 2016/0188944; U.S. Patent Application Publication No. 2016/0189076; U.S. Patent Application Publication No. 2016/0189087; U.S. Patent Application Publication No. 2016/0189088; U.S. Patent Application Publication No. 2016/0189092; U.S. Patent Application Publication No. 2016/0189284; U.S. Patent Application Publication No. 2016/0189288; U.S. Patent Application Publication No. 2016/0189366; U.S. Patent Application Publication No. 2016/0189443; U.S. Patent Application Publication No. 2016/0189447; U.S. Patent Application Publication No. 2016/0189489; U.S. Patent Application Publication No. 2016/0192051; U.S. Patent Application Publication No. 2016/0202951; U.S. Patent Application Publication No. 2016/0202958; U.S. Patent Application Publication No. 2016/0202959; U.S. Patent Application Publication No. 2016/0203021; U.S. Patent Application Publication No. 2016/0203429; U.S. Patent Application Publication No. 2016/0203797; U.S. Patent Application Publication No. 2016/0203820; U.S. Patent Application Publication No. 2016/0204623; U.S. Patent Application Publication No. 2016/0204636; U.S. Patent Application Publication No. 2016/0204638; U.S. Patent Application Publication No. 2016/0227912; U.S. Patent Application Publication No. 2016/0232891; U.S. Patent Application Publication No. 2016/0292477; U.S. Patent Application Publication No. 2016/0294779; U.S. Patent Application Publication No. 2016/0306769; U.S. Patent Application Publication No. 2016/0314276; U.S. Patent Application Publication No. 2016/0314294; U.S. Patent Application Publication No. 2016/0316190; U.S. Patent Application Publication No. 2016/0323310; U.S. Patent Application Publication No. 2016/0325677; U.S. Patent Application Publication No. 2016/0327614; U.S. Patent Application Publication No. 2016/0327930; U.S. Patent Application Publication No. 2016/0328762; U.S. Patent Application Publication No. 2016/0330218; U.S. Patent Application Publication No. 2016/0343163; U.S. Patent Application Publication No. 2016/0343176; U.S. Patent Application Publication No. 2016/0364914; U.S. Patent Application Publication No. 2016/0370220; U.S. Patent Application Publication No. 2016/0372282; U.S. Patent Application Publication No. 2016/0373847; U.S. Patent Application Publication No. 2016/0377414; U.S. Patent Application Publication No. 2016/0377417; U.S. Patent Application Publication No. 2017/0010141; U.S. 
Patent Application Publication No. 2017/0010328; U.S. Patent Application Publication No. 2017/0010780; U.S. Patent Application Publication No. 2017/0016714; U.S. Patent Application Publication No. 2017/0018094; U.S. Patent Application Publication No. 2017/0046603; U.S. Patent Application Publication No. 2017/0047864; U.S. Patent Application Publication No. 2017/0053146; U.S. Patent Application Publication No. 2017/0053147; U.S. Patent Application Publication No. 2017/0053647; U.S. Patent Application Publication No. 2017/0055606; U.S. Patent Application Publication No. 2017/0060316; U.S. Patent Application Publication No. 2017/0061961; U.S. Patent Application Publication No. 2017/0064634; U.S. Patent Application Publication No. 2017/0083730; U.S. Patent Application Publication No. 2017/0091502; U.S. Patent Application Publication No. 2017/0091706; U.S. Patent Application Publication No. 2017/0091741; U.S. Patent Application Publication No. 2017/0091904; U.S. Patent Application Publication No. 2017/0092908; U.S. Patent Application Publication No. 2017/0094238; U.S. Patent Application Publication No. 2017/0098947; U.S. Patent Application Publication No. 2017/0100949; U.S. Patent Application Publication No. 2017/0108838; U.S. Patent Application Publication No. 2017/0108895; U.S. Patent Application Publication No. 2017/0118355; U.S. Patent Application Publication No. 2017/0123598; U.S. Patent Application Publication No. 2017/0124369; U.S. Patent Application Publication No. 2017/0124396; U.S. Patent Application Publication No. 2017/0124687; U.S. Patent Application Publication No. 2017/0126873; U.S. Patent Application Publication No. 2017/0126904; U.S. Patent Application Publication No. 2017/0139012; U.S. Patent Application Publication No. 2017/0140329; U.S. Patent Application Publication No. 2017/0140731; U.S. Patent Application Publication No. 2017/0147847; U.S. Patent Application Publication No. 2017/0150124; U.S. Patent Application Publication No. 2017/0169198; U.S. Patent Application Publication No. 2017/0171035; U.S. Patent Application Publication No. 2017/0171703; U.S. Patent Application Publication No. 2017/0171803; U.S. Patent Application Publication No. 2017/0180359; U.S. Patent Application Publication No. 2017/0180577; U.S. Patent Application Publication No. 2017/0181299; U.S. Patent Application Publication No. 2017/0190192; U.S. Patent Application Publication No. 2017/0193432; U.S. Patent Application Publication No. 2017/0193461; U.S. Patent Application Publication No. 2017/0193727; U.S. Patent Application Publication No. 2017/0199266; U.S. Patent Application Publication No. 2017/0200108; and U.S. Patent Application Publication No. 2017/0200275.


In the specification and/or figures, typical embodiments of the present invention have been disclosed. The present invention is not limited to such exemplary embodiments. The use of the term “and/or” includes any and all combinations of one or more of the associated listed items. The figures are schematic representations and so are not necessarily drawn to scale. Unless otherwise noted, specific terms have been used in a generic and descriptive sense and not for purposes of limitation.

Claims
  • 1. A method for Optical Character Recognition (OCR), the method comprising: obtaining a plurality of OCR decoding results, wherein each OCR decoding result comprises a plurality of positions;calculating, for each of the plurality of OCR decoding results, a number of occurrences of a first character in a first position of the plurality of positions and the number of occurrences of a second character in the first position;determining that the first character in the first position of a first OCR decoding result of the plurality of OCR decoding results is not in a predefined lexicon and incrementing the number of occurrences of the first character;associating the first character with an individual confidence score based on the number of occurrences of the first character;associating the second character with the individual confidence score based on the number of occurrences of the second character; anddetermining a presumptively valid OCR decoding character result for the first position based on the individual confidence score.
  • 2. The method according to claim 1, wherein obtaining the plurality of OCR decoding results further comprises: capturing a plurality of images of the same one or more OCR characters; anddecoding each of the plurality of images to a character in each of the plurality of positions.
  • 3. The method according to claim 1, further comprising: calculating, for each of the plurality of OCR decoding results, the number of occurrences of each additional character in each remaining position of the plurality of positions, wherein the number of occurrences of each additional character is incremented in an instance in which a character is not in the predefined lexicon;associating each additional character in each remaining position with the individual confidence score; anddetermining the presumptively valid OCR decoding character result for each remaining position based on the individual confidence score.
  • 4. The method according to claim 2, further comprising: associating each presumptively valid OCR decoding character result with a total confidence score based on the individual confidence score; andselecting a final OCR decoding result that is associated with a highest total confidence score.
  • 5. The method according to claim 2, further comprising: selecting an identified character for each position of the OCR decoding result based on the identified character in each position having a highest total confidence score for that position across each OCR decoding character result.
  • 6. The method according to claim 1, further comprising: determining that the number of occurrences of the first character or the number of occurrences of the second character do not satisfy a particular threshold; andobtaining additional OCR decoding results.
  • 7. The method according to claim 1, further comprising: determining that the number of occurrences of the first character and the number of occurrences of the second character do not satisfy a particular threshold; andobtaining additional OCR decoding results.
  • 8. The method according to claim 1, further comprising: determining that the individual confidence score of the first character does not satisfy a minimum individual confidence score; andexcluding the first character.
  • 9. An optical character recognition (OCR) apparatus comprising: an image sensor for capturing one or more images; a memory; and a processor communicatively coupled to the image sensor to convert the one or more images into a plurality of OCR decoding results; and a non-transitory memory including computer program instructions configured to, when executed by the processor, cause the OCR apparatus to: calculate, for each of the plurality of OCR decoding results, a number of occurrences of a first character in a first position of a plurality of positions and the number of occurrences of a second character in the first position; determine that the first character in the first position of a first OCR decoding result of the plurality of OCR decoding results is not in a predefined lexicon and increment the number of occurrences of the first character; associate the first character with an individual confidence score based on the number of occurrences of the first character; associate the second character with the individual confidence score based on the number of occurrences of the second character; and determine a presumptively valid OCR decoding character result for the first position based on the individual confidence score.
  • 10. The OCR apparatus according to claim 9, wherein the non-transitory memory including the computer program instructions is further configured to, when executed by the processor, cause the OCR apparatus to: calculate, for each OCR decoding result of the plurality of OCR decoding results, the number of occurrences of each additional character in each remaining position of the plurality of positions, wherein the number of occurrences of each additional character is incremented in an instance in which a character is not in the predefined lexicon; associate each additional character in each remaining position with the individual confidence score; and determine the presumptively valid OCR decoding character result for each remaining position based on the individual confidence score.
  • 11. The OCR apparatus according to claim 10, wherein the non-transitory memory including the computer program instructions is further configured to, when executed by the processor, cause the OCR apparatus to: associate each presumptively valid OCR decoding character result with a total confidence score based on the individual confidence score; and select a final OCR decoding result that is associated with a highest total confidence score.
  • 12. The OCR apparatus according to claim 10, wherein the non-transitory memory including the computer program instructions is further configured to, when executed by the processor, cause the OCR apparatus to: select an identified character for each position of the OCR decoding result based on the identified character in each position having a highest total confidence score for that position across each OCR decoding character result.
  • 13. The OCR apparatus according to claim 9, wherein the non-transitory memory including the computer program instructions is further configured to, when executed by the processor, cause the OCR apparatus to: determine that the number of occurrences of the first character or the number of occurrences of the second character do not satisfy a particular threshold; and obtain additional OCR decoding results.
  • 14. The OCR apparatus according to claim 9, wherein the non-transitory memory including the computer program instructions is further configured to, when executed by the processor, cause the OCR apparatus to: determine that the number of occurrences of the first character and the number of occurrences of the second character do not satisfy a particular threshold; and obtain additional OCR decoding results.
  • 15. The OCR apparatus according to claim 9, wherein the non-transitory memory including the computer program instructions is further configured to, when executed by the processor, cause the OCR apparatus to: determine that the individual confidence score of the first character does not satisfy a minimum individual confidence score; and exclude the first character.
  • 16. A non-transitory computer program product for Optical Character Recognition (OCR), wherein the non-transitory computer program product comprises: a non-transitory computer readable storage medium having computer readable program code embodied in said non-transitory computer readable storage medium, said computer readable program code comprising steps of: obtaining a plurality of OCR decoding results, wherein each OCR decoding result comprises a plurality of positions; calculating, for each of the plurality of OCR decoding results, a number of occurrences of a first character in a first position of the plurality of positions and the number of occurrences of a second character in the first position; determining that the first character in the first position of a first OCR decoding result of the plurality of OCR decoding results is not in a predefined lexicon and incrementing the number of occurrences of the first character; associating the first character with an individual confidence score based on the number of occurrences of the first character; associating the second character with the individual confidence score based on the number of occurrences of the second character; and determining a presumptively valid OCR decoding character result for the first position based on the individual confidence score.
  • 17. The non-transitory computer program product according to claim 16, wherein said computer readable program code further comprises the steps of: capturing a plurality of images of the same one or more OCR characters; and decoding each of the plurality of images to a character in each of the plurality of positions.
  • 18. The non-transitory computer program product according to claim 16, wherein said computer readable program code further comprises the steps of: calculating, for each of the plurality of OCR decoding results, the number of occurrences of each additional character in each remaining position of the plurality of positions, wherein the number of occurrences of each additional character is incremented in an instance in which a character is not in the predefined lexicon; associating each additional character in each remaining position with the individual confidence score; and determining the presumptively valid OCR decoding character result for each remaining position based on the individual confidence score.
  • 19. The non-transitory computer program product according to claim 18, wherein said computer readable program code further comprises the steps of: associating each presumptively valid OCR decoding character result with a total confidence score based on the individual confidence score; and selecting a final OCR decoding result that is associated with a highest total confidence score.
  • 20. The non-transitory computer program product according to claim 18, wherein said computer readable program code further comprises a step of: selecting an identified character for each position of the OCR decoding result based on the identified character in each position having a highest total confidence score for that position across each OCR decoding character result.
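
The voting procedure recited in claims 1, 3, and 6-8 can be made concrete with a short sketch. The Python below is an illustrative reading of the claim language, not the patented implementation: the names (`tally_votes`, `confidence_scores`, `presumptively_valid`, `LEXICON`, `min_confidence`) are hypothetical, and turning occurrence counts into fractional scores is just one plausible scoring; the claims require only that each individual confidence score be based on the number of occurrences.

```python
from collections import Counter, defaultdict

# Hypothetical predefined lexicon of expected characters.
LEXICON = set("ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789")

def tally_votes(decode_results, lexicon=LEXICON):
    """Count, per position, how often each character was decoded.

    Mirrors claims 1 and 3: membership in the predefined lexicon is
    determined, but the occurrence count is incremented whether or not
    the character is in the lexicon.
    """
    counts = defaultdict(Counter)  # position -> Counter of characters
    for result in decode_results:
        for position, char in enumerate(result):
            _ = char in lexicon          # the "determining" step of claim 1
            counts[position][char] += 1  # incremented either way
    return counts

def confidence_scores(counts, num_results):
    """Individual confidence score per character: here, the fraction of
    decodes that produced that character at that position (an assumed
    normalization of the occurrence counts)."""
    return {pos: {char: n / num_results for char, n in ctr.items()}
            for pos, ctr in counts.items()}

def presumptively_valid(scores, min_confidence=0.6):
    """Per position, pick the highest-scoring character; characters below
    a minimum individual confidence are excluded (claim 8). Returns None
    when no candidate satisfies the threshold, signalling that additional
    OCR decoding results should be obtained (claims 6 and 7)."""
    chosen = []
    for position in sorted(scores):
        candidates = {c: s for c, s in scores[position].items()
                      if s >= min_confidence}
        if not candidates:
            return None
        chosen.append(max(candidates, key=candidates.get))
    return "".join(chosen)
```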
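
Claims 4 and 5 additionally aggregate the individual scores into a total confidence score and select the final OCR decoding result with the highest total. Continuing the sketch above, with the mean as an assumed aggregation (any combination based on the individual scores would fit the claim language):

```python
def total_confidence(result, scores):
    """Total confidence score of a candidate result: the mean of the
    individual scores of its characters (one assumed aggregation)."""
    individual = [scores[pos].get(char, 0.0)
                  for pos, char in enumerate(result)]
    return sum(individual) / len(individual) if individual else 0.0

# Five decodes of the same three-character label, one with a misread.
decodes = ["AB8", "AB8", "A88", "AB8", "AB8"]
scores = confidence_scores(tally_votes(decodes), len(decodes))
best = presumptively_valid(scores)           # "AB8": "B" outvotes "8" at position 1
print(best, total_confidence(best, scores))  # AB8 0.933... (mean of 1.0, 0.8, 1.0)
```

With five captures and a single misread at one position, the per-position vote still converges on a presumptively valid result rather than waiting for several fully identical decodes.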
CROSS-REFERENCE TO RELATED APPLICATION

The present application is a continuation application of U.S. patent application Ser. No. 15/720,550, filed on Sep. 29, 2017, the entire content of which is incorporated by reference into the present application.

Non-Patent Literature Citations (4)
  • U.S. Appl. No. 15/720,550, filed Sep. 29, 2017, now U.S. Pat. No. 10,621,470.
  • Non-Final Rejection dated Jan. 23, 2019 for U.S. Appl. No. 15/720,550.
  • Final Rejection dated May 17, 2019 for U.S. Appl. No. 15/720,550.
  • Notice of Allowance and Fees Due (PTOL-85) dated Dec. 11, 2019 for U.S. Appl. No. 15/720,550.
Related Publications (1)
  • US 2020/0250469 A1, Aug. 2020, US.
Continuations (1)
  • Parent: U.S. Appl. No. 15/720,550, filed Sep. 2017 (US). Child: U.S. Appl. No. 16/846,041 (US).