Optical character recognition systems and methods

Information

  • Patent Grant
  • 11593591
  • Patent Number
    11,593,591
  • Date Filed
    Tuesday, May 5, 2020
  • Date Issued
    Tuesday, February 28, 2023
  • CPC
  • Field of Search
    • CPC
    • G06K2209/01
    • G06K19/06028
    • G06K7/1095
    • G06K7/1413
    • G06K7/1443
    • G06K9/58
    • G06K9/723
  • International Classifications
    • G06K9/62
    • G06V10/88
    • G06V10/75
    • G06V30/244
    • G06V30/32
    • G06V30/224
    • G06V30/10
    • Term Extension
      157
Abstract
The present disclosure is generally directed to systems and methods for executing optical character recognition faster than at least some traditional OCR systems, without sacrificing recognition accuracy. Towards this end, various exemplary embodiments involve the use of a bounding box and a grid-based template to identify certain unique aspects of each of various characters and/or numerals. For example, in one embodiment, the grid-based template can be used to recognize a numeral and/or a character based on a difference in centerline height between the numeral and the character when a monospaced font is used. In another exemplary embodiment, the grid-based template can be used to recognize an individual digit among a plurality of digits based on certain parts of the individual digit being uniquely located in specific portions of the grid-based template.
Description
FIELD OF THE INVENTION

The present invention generally relates to optical character recognition systems and more particularly relates to systems and methods for improving recognition speed.


BACKGROUND

Traditional optical character recognition (OCR) systems often tend to sacrifice speed in the interests of ensuring accuracy in character recognition. The traditional character recognition process typically incorporates a template to execute a character-by-character recognition of various characters. The template, which can be one of a number of different types of templates, is associated with a pattern-matching algorithm that identifies a specific numeral or a letter of an alphabet. Certain characters such as the numeral zero and the letter O are relatively similar to each other. Consequently, the process of using the pattern-matching algorithm tends to be slow in order to ensure that such characters are not misinterpreted. However, it is desirable to provide systems and methods that provide for faster optical character recognition without sacrificing accuracy.


SUMMARY

In an exemplary embodiment in accordance with the disclosure, a method includes using an optical character recognition system to execute an optical character recognition procedure. The optical character recognition procedure includes applying a grid-based template to a character having a monospaced font; defining in the grid-based template, a first grid section that includes a first portion of the character when the character has a first size and excludes the first portion of the character when the character has a second size that is smaller than the first size; and recognizing the character as a numeral when the first grid section includes the first portion of the character.


In another exemplary embodiment in accordance with the disclosure, a method includes providing to an optical character recognition system, a barcode label containing a plurality of digits, and using the optical character recognition system to execute an optical character recognition procedure. The optical character recognition procedure includes applying a bounding box to an individual digit among the plurality of digits contained in the barcode label; applying a grid-based template to the bounding box, the grid-based template comprising a plurality of grid sections; and using the plurality of grid sections to identify the individual digit contained in the bounding box.


In yet another exemplary embodiment in accordance with the disclosure, a method includes using an optical character recognition system to execute an optical character recognition procedure. The optical character recognition procedure includes applying a bounding box to a character; applying a grid-based template to the bounding box; defining a portion of the grid-based template as a primary search area; and using at least the primary search area to identify the character contained in the bounding box.


The foregoing illustrative summary, as well as other exemplary objectives and/or advantages described in this disclosure, and the manner in which the same are accomplished, are further explained within the following detailed description and its accompanying drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 schematically depicts an exemplary embodiment of an OCR system in accordance with the disclosure.



FIG. 2 schematically depicts another exemplary embodiment of an OCR system in accordance with the disclosure.



FIG. 3 shows an exemplary grid-based template that can be used to execute optical character recognition in accordance with the disclosure.



FIG. 4 shows an exemplary grid-based template that can be used to execute optical character recognition for identifying a specific letter of an alphabet in accordance with the disclosure.



FIG. 5 shows an exemplary list of coordinate locations in a grid-based template that can be used to uniquely identify letters of the English alphabet in accordance with the disclosure.



FIG. 6 shows an exemplary grid-based template when used to execute optical character recognition for identifying a specific numeral in accordance with the disclosure.



FIG. 7 shows a flowchart of a method to execute optical character recognition in accordance with the disclosure.



FIG. 8 shows an exemplary set of coordinate locations and a look-up table that can be used in conjunction with the set of coordinate locations to uniquely identify any character using a single-step recognition procedure in accordance with the disclosure.





DETAILED DESCRIPTION

Throughout this description, embodiments and variations are described for the purpose of illustrating uses and implementations of inventive concepts. The illustrative description should be understood as presenting examples of inventive concepts, rather than as limiting the scope of the concepts as disclosed herein. Towards this end, certain words and terms are used herein solely for convenience and such words and terms should be broadly understood as encompassing various objects and actions that are generally understood in various forms and equivalencies by persons of ordinary skill in the art. For example, the word “numeral” as used herein is equally applicable to other words such as “digit” and “number.” The word “character” as used herein pertains to any printed or written material that is recognizable using optical character recognition techniques. It should also be understood that the words “example” and “exemplary” as used herein are intended to be non-exclusionary and non-limiting in nature. More particularly, these words indicate one among several examples, and no special emphasis, exclusivity, or preference is associated with or implied by their use.


The present disclosure is generally directed to systems and methods for executing optical character recognition faster than at least some traditional OCR systems, without sacrificing recognition accuracy. Towards this end, various exemplary embodiments involve the use of a bounding box and a grid-based template to identify certain unique aspects of each of various characters and/or numerals. For example, in one embodiment, the grid-based template can be used to recognize a numeral and/or a character based on a difference in centerline height between the numeral and the character when a monospaced font is used. In another exemplary embodiment, the grid-based template can be used to recognize an individual digit among a plurality of digits based on certain parts of the individual digit being uniquely located in specific portions of the grid-based template.



FIG. 1 schematically depicts an exemplary OCR system 100 in accordance with the disclosure. The OCR system 100 depicted in this exemplary embodiment includes an image scanning system 110 (a flat-bed scanner in this example) communicatively coupled to a computer 105. In other embodiments, various other hardware elements, such as a handheld scanner or an overhead scanner, can constitute the image scanning system 110.


When in operation, the image scanning system 110 captures an image of characters and text located on an object 107 such as a printed sheet of paper or a machine-readable zone (MRZ) on a passport. The captured image is provided to the computer 105 via a communication link 106 (a wire, a communications cable, a wireless link, etc.). The computer 105 includes OCR software that is used to carry out OCR operations upon the captured image in accordance with the disclosure.


In an alternative embodiment, the computer 105 can be omitted and the OCR software incorporated into the image scanning system 110, which operates as a multifunction unit to execute various operations such as scanning, printing, faxing, and OCR.


In yet another alternative embodiment, the image scanning system 110 can be omitted and the computer 105 configured to generate a document and/or receive a document via a communication network such as the Internet. OCR software contained in the computer 105 in accordance with the disclosure can then be used to carry out OCR operations upon the received/generated document.



FIG. 2 schematically depicts an exemplary OCR system 200 in accordance with the disclosure. OCR system 200 includes an image scanning system 210 communicatively coupled to a processing system 205 via a communications link 207 (such as a wire, a communications cable, a wireless link, or a metal track on a printed circuit board). The image scanning system 210 includes a light source 211 that projects light through a transparent window 213 upon an object 214. The object 214, which can be a sheet of paper containing text and/or images, reflects the light towards an image sensor 212. The image sensor 212, which contains light sensing elements such as photodiodes and/or photocells, converts the received light into electrical signals (digital bits for example) that are transmitted to the OCR software 206 contained in the processing system 205. In one example embodiment, the OCR system 200 is a slot scanner incorporating a linear array of photocells. The OCR software 206 that is a part of the processing system 205 can be used in accordance with the disclosure to operate upon the electrical signals for performing optical character recognition of the material printed upon the object 214.
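
As a small illustration of the sensor-to-bits step, the readings produced by the image sensor 212 could be thresholded into a binary bitmap before recognition. This is a minimal sketch only; the 0-255 reading range, the threshold value, and the function name are assumptions rather than details from the disclosure.

```python
from typing import Iterable, List

def to_bitmap(sensor_rows: Iterable[Iterable[int]], threshold: int = 128) -> List[List[int]]:
    """Threshold raw photocell readings into ink (1) / background (0) bits.

    The 0-255 reading range and the threshold of 128 are illustrative assumptions;
    the disclosure states only that the sensor output is converted to digital bits.
    """
    return [[1 if value < threshold else 0 for value in row] for row in sensor_rows]
```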



FIG. 3 shows a first exemplary grid-based template 300 that can be used to execute optical character recognition in accordance with the disclosure. The exemplary grid-based template 300, which is executed in the form of a software algorithm, has a rectangular shape with an x-axis having a first set of numerical coordinates (ranging from −40 to +40 in this example) and a y-axis having a second set of numerical coordinates (ranging from 0 to 140 in this example). The numerical coordinates of this x-y mapping system can be used to define various grid sections in the grid-based template 300 (grid section 305, grid section 310, grid section 315, etc.).
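
As a minimal sketch of how such a template might be represented in software, the class below maps template coordinates onto a character bitmap that has been aligned to the template. The class name, the bitmap representation, and the coordinate-to-pixel scaling are illustrative assumptions, not details taken from the disclosure.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class GridTemplate:
    """Maps template coordinates (x in [-40, 40], y in [0, 140]) onto a character
    bitmap that has been aligned to the template's reference frame."""
    bitmap: List[List[int]]   # bitmap[row][col], 1 = ink, 0 = background
    x_min: int = -40
    x_max: int = 40
    y_min: int = 0
    y_max: int = 140

    def has_ink(self, x: int, y: int) -> bool:
        """True when a portion of the character is present at template coordinate (x, y)."""
        rows, cols = len(self.bitmap), len(self.bitmap[0])
        # Scale template coordinates to pixel indices; y grows upward, rows grow downward.
        col = round((x - self.x_min) / (self.x_max - self.x_min) * (cols - 1))
        row = round((self.y_max - y) / (self.y_max - self.y_min) * (rows - 1))
        return self.bitmap[row][col] == 1
```

The later sketches in this description query a character through a `has_ink(x, y)` callable of this kind, for example `GridTemplate(bitmap).has_ink`.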


More particularly, in accordance with the disclosure, the grid-based template 300 can be used to perform character recognition upon various text characters such as the numeral “9” and the letter “P” of an alphabet that are shown as examples. In one or more exemplary implementations, the characters can conform to what is known in the industry as an OCR font. One specific OCR font that is popular in the industry is the OCR-B font. The OCR-B font, which resembles an Arial font to some extent, is used on various items such as passports and car license plates, and as human-readable characters in barcode symbols (European Article Number (EAN) barcodes and Universal Product Code (UPC) barcodes, for example).


The OCR-B font has been specifically tailored to minimize machine reading errors. However, even with such tailoring, it is often quite time-consuming for traditional OCR systems to execute character recognition, because each character has to be recognized using a multi-step template-based matching procedure involving the use of a number of templates. For example, the International Civil Aviation Organization (ICAO) uses thirty-seven templates for identifying various characters in a passport. Thus, for a symbol containing 20 characters, a traditional multi-step template-based matching procedure can involve executing 740 match attempts using the thirty-seven templates (20×37=740).


The use of the grid-based template 300 to perform character recognition in accordance with the disclosure will now be described. The similarity in shapes between these two exemplary characters (“9” and “P”) poses a challenge to any OCR system, particularly to a traditional system that uses a multi-step template-based matching procedure involving a relatively large number of templates. Thus, in one exemplary method in accordance with the disclosure, the first step involves the use of the grid-based template 300 to identify differences between the two characters on the basis of size. Such an approach takes advantage of the fact that in various fonts, and particularly in the OCR-B font, numerals have a different size in comparison to the letters of an alphabet.


More particularly, the OCR-B font is a monospaced font having a fixed spacing between adjacent characters and variable heights for the various characters. The rectangular shape of the grid-based template 300 is configured to accommodate numerals and letters that are aligned with respect to a common reference point (in this example, the common reference point corresponds to the coordinates 0,0 of the grid-based template 300), thereby allowing a comparison of centerline heights between two or more characters. The numerals can be quickly detected based on the fact that the centerline height of every numeral is greater than the centerline height of every letter of an alphabet when the OCR-B font is used. In some implementations in accordance with the disclosure, a primary search area can be defined as a circle having a diameter that is substantially equal to a width of the monospaced font, or as an oval having a minor axis that is substantially equal to a width of the monospaced font and a major axis that is substantially equal to a centerline height of a numeral in the monospaced font.
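
A brief sketch of the oval form of the primary search area described above follows; the assumption that the area is centered on the character cell, as well as the parameter and function names, are illustrative only (the circular variant is the special case in which both axes equal the font width).

```python
def in_primary_search_area(x: float, y: float,
                           font_width: float, numeral_height: float) -> bool:
    """Test whether template coordinate (x, y) falls inside an oval whose minor axis
    is the monospaced font width and whose major axis is the centerline height of a
    numeral in that font."""
    cx, cy = 0.0, numeral_height / 2.0   # assumed center of the character cell
    a = font_width / 2.0                 # semi-minor axis (horizontal)
    b = numeral_height / 2.0             # semi-major axis (vertical)
    return ((x - cx) / a) ** 2 + ((y - cy) / b) ** 2 <= 1.0
```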


With respect to the two exemplary characters shown in FIG. 3, the numeral “9” has a centerline height that approximately corresponds to a numerical coordinate 128 on the y-axis of the grid-based template 300. In contrast, the letter “P” is shorter than the numeral “9” and has a centerline height that approximately corresponds to a numerical coordinate 118 on the y-axis of the grid-based template 300.


The software algorithm used for executing the grid-based template 300 can recognize that a portion of the numeral “9” is present in certain grid sections such as in grid section 305 and grid section 310, whereas no portion of the letter “P” (which is shorter than the numeral “9”) is present in either grid section 305 or grid section 310. More particularly, when using the exemplary grid-based template 300, none of the grid sections above the numerical coordinate 120 will contain any portion of a letter.


Thus, in accordance with the disclosure, the software algorithm is used to apply the grid-based template 300 to an image and rapidly differentiate between a numeral and a letter based on the centerline height difference between numerals and letters of a monospaced font. The differentiating procedure thus eliminates a large subset of characters from being considered as potential candidates for further processing.
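
A sketch of this differentiating step is shown below, reusing the `has_ink(x, y)` convention from the earlier template sketch. The ceiling of 120 and the template extents come from the example above; the exhaustive scan over the upper region is simply one straightforward way to apply the test.

```python
def is_numeral(has_ink, letter_ceiling: int = 120) -> bool:
    """Return True when any ink lies above the letter ceiling on the template.

    In the OCR-B example above, no letter reaches above y = 120, so ink anywhere
    in that upper region marks the character as a numeral. has_ink(x, y) reports
    whether a portion of the character is present at a template coordinate.
    """
    return any(has_ink(x, y)
               for y in range(letter_ceiling + 1, 141)   # grid sections above the ceiling
               for x in range(-40, 41))                  # full width of the template
```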



FIG. 4 shows the grid-based template 300 when used to execute optical character recognition for identifying a specific letter of an alphabet in accordance with the disclosure. As described above, no portion of the character will be present in the grid sections above the numerical coordinate 120 in this example. Consequently, the software algorithm limits the processing to grid sections below the numerical coordinate 120 and can further limit the processing to a minimal set of grid locations (three grid sections, for example) within that region. The minimal set of grid locations can also be defined at least in part by a circle having a diameter that is substantially equal to a width of the monospaced font, or by an oval having a minor axis that is substantially equal to a width of the monospaced font and a major axis that is substantially equal to a centerline height of a numeral in the monospaced font.


The processing involves identifying a letter by searching for the presence of a portion of a number of candidate letters (a through z, when the letter is part of the English alphabet) in various grid sections of the grid-based template 300. An inefficient way to carry out the search would involve a time-consuming scan of each and every grid section of the grid-based template 300. In accordance with the disclosure, on the other hand, the search is carried out by first eliminating from the search those grid sections that are known beforehand to be areas in which no portion of any letter will be present. The search is thus confined to areas where it is feasible that portions of any one of various letters may be present. The search is further narrowed to examine certain unique grid coordinate locations on the grid-based template 300 that assist in quickly identifying a specific letter among all the potential candidates.


Accordingly, in one exemplary embodiment, the narrowed search procedure involves examining a group of three coordinate locations in a minimal group of grid sections along the y-axis of the grid-based template 300. The three coordinate locations (coordinate location 405, coordinate location 410, and coordinate location 415) provide information that assists in uniquely identifying a particular letter among a set of letters. The set of letters shown in the example of FIG. 4 is A, J, M, and W. A portion of the letter A is present at the coordinate location 405 corresponding to (0, 118), which is unique to the letter A. No portion of J, M, or W is present at the first coordinate location 405.


A portion of the letter W is present at a coordinate location 410 corresponding to (0, 68), which is unique to the letter W. No portion of A, J or M is present at the second coordinate location 410.


A portion of the letter J is present at the coordinate location 415 corresponding to (0, 8), which is unique to the letter J. No portion of A, M or W is present at the third coordinate location 415.


The letter M has no portion present at any of the coordinate location 405, the coordinate location 410, or the coordinate location 415.


In another exemplary embodiment, the narrowed search procedure involves examining a set of four grid sections located at four corners of the grid-based template 300. This set of four grid sections can provide additional information such as the presence of portions of each of multiple letters and/or an absence of one or more portions of one or more letters.


Upon completing the search procedure at the group of three exemplary coordinate locations, the software algorithm uses a lookup table to at least make a preliminary determination of the identity of the letter. The lookup table includes information indicating that the letter A is uniquely identifiable via the first coordinate location 405, the letter W is uniquely identifiable via the second coordinate location 410, the letter J is uniquely identifiable via the third coordinate location 415, and so on. Using a compact search procedure based on three unique coordinate locations (in this example), coupled with the use of a lookup table, allows for fast recognition of various letters in accordance with the disclosure. In other exemplary search procedures, fewer or more than three coordinate locations can be used. Furthermore, in some embodiments, the use of unique coordinate locations as described above allows for execution of a search procedure for identifying a letter without necessarily first making a determination whether the character is a numeral or a letter.
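
For the four-letter example of FIG. 4, the compact search and lookup could look like the sketch below. The function name is illustrative, and only A, J, M, and W are covered; per FIG. 5, other letters require combinations of coordinate locations.

```python
# Unique coordinate locations from the FIG. 4 example: ink at exactly one of these
# identifies A, W, or J, and the letter M is returned by default when none match.
UNIQUE_LETTER_COORDS = {(0, 118): "A", (0, 68): "W", (0, 8): "J"}

def identify_letter_ajmw(has_ink, default: str = "M") -> str:
    """Probe the three unique coordinate locations and read the letter from the table."""
    for (x, y), letter in UNIQUE_LETTER_COORDS.items():
        if has_ink(x, y):
            return letter
    return default
```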



FIG. 5 shows an exemplary list of coordinate locations in the grid-based template 300 that can be used to uniquely identify letters of the English alphabet in accordance with the disclosure. As can be understood from the list, some letters, such as F, P, R, B, and E, for example, can be identified by using a combination of two or more unique coordinate locations because these letters cannot be uniquely identified by using a single coordinate location as is done for A, J, and W described above. Furthermore, certain letters such as K and M can be identified by using a default identification mode, as no portion of any of these letters is present at the three coordinate locations. The default identification mode can be applied after completion of the search for letters such as A, J, M, and W based on the three coordinate locations.



FIG. 6 shows the grid-based template 300 when used to execute optical character recognition for identifying a specific numeral in accordance with the disclosure. In a manner similar to that described above when using the grid-based template 300 to identify a specific letter, a search can be carried out in accordance with the disclosure to detect the presence of a numeral at certain unique coordinate locations and/or grid sections on the grid-based template 300.


Accordingly, in one exemplary embodiment, a search procedure is carried out by first eliminating from the search those grid sections that are known beforehand to be areas in which no portion of any numeral will be present. The search is thus confined to areas where any one of various numerals can be present. However, the search is further narrowed to first examine certain unique grid sections and/or grid coordinate locations where portions of one or more specific numerals may be present.


Accordingly, in one exemplary embodiment, the narrowed search procedure involves detecting the numeral “1” (indicated in a dashed line format) by examining four coordinate locations located in four specific grid sections that constitute a minimal group of grid sections in this case. The presence of a portion of a numeral at the coordinate location 605 (first coordinate location) provides a strong indication that the numeral can be a “1”. The identity of the numeral can be confirmed by examining three additional coordinate locations, which in this case correspond to the coordinate location 610, the coordinate location 615, and the coordinate location 620. The presence of other portions of the numeral at each additional coordinate location provides a continuously increasing level of confidence that the numeral is indeed a “1”. Thus, testing at most four coordinate locations provides a strong indication that the recognized numeral is a “1” without having to search additional areas of the grid-based template 300. Other numerals can be similarly recognized using fewer or more coordinate locations.


Upon completing the search procedure for the numeral “1” at the exemplary coordinate locations, the software algorithm uses a lookup table to at least make a preliminary determination of the identity of the numeral. The lookup table includes information indicating that the numeral “1” is uniquely identifiable via the four coordinates described above.
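
A sketch of this numeral search for the “1” and “7” examples of FIG. 6 follows. The disclosure identifies the probe points only by reference numerals (605 through 620 for “1”; 625 and 630 for “7”), so the coordinate values below are placeholders chosen for illustration.

```python
from typing import Callable, Optional, Tuple

# Placeholder coordinates standing in for the probe locations of FIG. 6.
NUMERAL_PROBES = {
    "1": [(0, 130), (0, 95), (0, 60), (0, 25)],   # assumed points along the stem (605-620)
    "7": [(-20, 128), (5, 60)],                   # assumed points on the bar and diagonal (625, 630)
}

def identify_numeral(has_ink: Callable[[int, int], bool]) -> Optional[Tuple[str, float]]:
    """Return the best-matching (numeral, confidence) pair, or None when nothing matches.

    Confidence grows with the fraction of a candidate's probe locations that contain
    ink, mirroring the increasing level of confidence described above.
    """
    best: Optional[Tuple[str, float]] = None
    for numeral, probes in NUMERAL_PROBES.items():
        hits = sum(1 for (x, y) in probes if has_ink(x, y))
        confidence = hits / len(probes)
        if confidence > 0 and (best is None or confidence > best[1]):
            best = (numeral, confidence)
    return best
```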


As another example, a narrowed search procedure for detecting the numeral “7” (indicated in a solid line format) can be carried out by first examining coordinate location 625. The presence of a portion of a numeral at the coordinate location 625 provides a strong indication that the numeral can be a “7”. The identity of the numeral can be confirmed by examining additional coordinate locations such as coordinate location 630. The lookup table includes information indicating that the numeral “7” is uniquely identifiable via the coordinate location 625 and/or the coordinate location 630.


Furthermore, in some embodiments, the use of unique coordinate locations as described above, allows for execution of a search procedure for identifying a numeral without necessarily first identifying whether the character is a numeral or a letter.



FIG. 7 shows a flowchart 700 of a method to execute optical character recognition in accordance with the disclosure. It is to be understood that any method steps or blocks shown in FIG. 7 represent modules, segments, or portions of code that include one or more executable instructions for implementing specific logical functions or steps in the method. In certain implementations, one or more of the steps may be performed manually. It will be appreciated that, although particular example method steps are described below, additional steps or alternative steps may be utilized in various implementations without detracting from the spirit of the invention. Moreover, steps may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on various alternative implementations. Code may also be contained in one or more devices, and may not necessarily be confined to any one particular type of device.


Block 705 of the flowchart 700 pertains to using a grid-based template, such as the grid-based template 300 described above, to first distinguish between the presence of one or more numerals in an image and one or more letters in the image. This action can be carried out as described above, by taking advantage of the characteristic that numerals defined in a monospaced font such as OCR-B are taller than letters that are also defined in the monospaced font. If a numeral cannot be distinguished from a letter (for various reasons), the action indicated by block 705 assumes that the image contains numerals and the method step indicated in block 730 is executed. On the other hand, if a determination is made in block 705 that the image contains numerals and may contain letters as well, either the method step indicated in block 730 or the method step indicated in block 710 can be executed following execution of block 705.
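
A control-flow sketch of this portion of the flowchart is given below. The recognizer callables are assumed to behave like the per-character sketches earlier in this description, and returning the results in character order is a simplification of the assembly performed in blocks 725, 745, and 750.

```python
def recognize_characters(characters, is_numeral, identify_numeral, identify_letter):
    """Outline of blocks 705-750: classify each character cell by the height test,
    run the numeral or letter procedure on it, then assemble the combined result."""
    results = []
    for has_ink in characters:                           # one has_ink(x, y) callable per character cell
        if is_numeral(has_ink):                          # block 705: centerline-height test
            results.append(identify_numeral(has_ink))    # blocks 730-745: numeral recognition
        else:
            results.append(identify_letter(has_ink))     # blocks 710-725: letter recognition
    return results                                       # block 750: character recognition result
```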


Block 730 pertains to executing a numeral recognition procedure (as described above with reference to FIG. 6) using “n” coordinate locations in the grid-based template to uniquely identify one or more numerals. It should be understood that in some cases, “n” can be equal to 1, such as by examining only coordinate location 605 for detecting a portion of the numeral “1” with a certain level of confidence. The confidence level can be raised by confirming the identity of the numeral “1” by examining additional coordinate locations, such as coordinate location 610, coordinate location 615, and coordinate location 620. The improvement in confidence level is obtained at the expense of increased computation time. Consequently, the value of “n” that is selected for carrying out the action indicated in block 730 is based on a trade-off between confidence in recognition and speed of operation.
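
The role of “n” can be sketched as a simple truncation of a candidate's probe list: checking only the first “n” unique coordinate locations trades confidence for speed. The function below follows the conventions of the earlier sketches and is illustrative only.

```python
def probe_confidence(has_ink, probes, n: int) -> float:
    """Check only the first n coordinate locations of a candidate character.

    n = 1 corresponds to probing a single location (e.g., location 605 for the
    numeral "1"); larger values of n confirm the match against additional
    locations at the cost of extra probe operations.
    """
    checked = probes[:max(1, n)]
    if not checked:
        return 0.0
    return sum(1 for (x, y) in checked if has_ink(x, y)) / len(checked)
```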


Block 735 pertains to using a look-up table to identify the detected data obtained by carrying out the action indicated in block 730. Thus, for example, the lookup table is used to identify the numeral “1” based on detecting the presence of a portion of the numeral at coordinate location 605 (and confirmed by the presence of other portions of the numeral at coordinate location 610, coordinate location 615, and/or coordinate location 620).


In block 740 a determination is made if additional numerals contained in the image are to be recognized. If yes, operation proceeds from block 740 back to block 730. If no, operation proceeds from block 740 to block 745.


Block 745 pertains to assembling information on recognized numerals obtained by executing the previous blocks (block 730, block 735, and block 740). At this point, in one exemplary implementation the action proceeds from block 745 to block 750 where the one or more recognized numerals are provided as a character recognition result. For example, the action indicated in block 750 can pertain to combining multiple recognized digits of a barcode label and providing the character recognition result to a computer for identifying an object upon which the barcode label is affixed.


However, in another exemplary implementation, when a character recognition procedure involves recognizing both letters and numerals, the action proceeds from block 745 to block 710 (as indicated by dashed line 746).


In yet another exemplary implementation, rather than proceeding from block 745 to block 710, the method step indicated in block 710 is executed following execution of block 705. Subsequent actions indicated in block 715, block 720, and block 725 for recognizing one or more letters can then be executed in parallel with actions indicated in block 730, block 735, block 740, and block 745 for recognizing one or more numerals.


Block 710 pertains to executing a letter recognition procedure (as described above with reference to FIG. 4 and FIG. 5) using “n” coordinate locations in the grid-based template to uniquely identify one or more letters. The value of “n” can be selected using criteria similar to those described above with respect to block 730 for recognizing numerals. It should therefore be understood that the value of “n” selected for carrying out the action indicated in block 710 is based on a trade-off between confidence in recognition and speed of operation.


Block 715 pertains to using a look-up table to identify the detected data obtained by carrying out the action indicated in block 710. Thus, for example, the lookup table is used to identify the letter “A” based on detecting the presence of a portion of the letter at coordinate location 405.


In block 720 a determination is made if additional letters contained in the image are to be recognized. If yes, operation proceeds from block 720 back to block 710. If no, operation proceeds from block 720 to block 725.


Block 725 pertains to assembling information on recognized letters obtained by executing the previous blocks (block 710, block 715, and block 720). Action proceeds from block 725 to block 750 where the one or more recognized letters are combined with one or more recognized numerals (derived by executing actions indicated in block 745) and provided as a character recognition result.


The description above pertained to using “n” coordinate locations to identify one or more letters and/or one or more numerals with various levels of confidence. A single-step character recognition procedure in accordance with the disclosure, which will be described below in more detail, involves presetting “n” to a certain value so as to quickly and uniquely recognize any letter or numeral in a single step. The presetting can be carried out in various ways such as by using statistics to identify a suitable Hamming distance that is indicative of differences between various characters.



FIG. 8 shows an exemplary set of coordinate locations 805 (wherein “n” has been preset to an exemplary value equal to ten) and a look-up table 810 that can be used in conjunction with the set of coordinate locations 805 to uniquely identify any monospaced character using a single-step recognition procedure.


As can be understood from the set of coordinate locations 805, the first location of the ten locations corresponds to the coordinates (0,116) on a grid-based template such as the grid-based template 300 described above; the second location of the ten locations corresponds to the coordinates (0,64); the third location of the ten locations corresponds to the coordinates (0,7), and so on. The look-up table 810 includes uppercase letters (A to Z), numbers (0 to 9), and a “less than” symbol (“<”), each of which is formatted in an OCR-B font. The OCR-B font is based on a centerline drawing standard specified by the International Organization for Standardization (ISO), and the OCR-B character subset that is used in passports is defined by ICAO to include capital letters, numerals and the symbol “<” as enumerated in the look-up table 810.


A single-step recognition procedure in accordance with the disclosure involves using an OCR system (such as the OCR system 200 described above) to examine each of the ten locations identified in the set of coordinate locations 805. Thus, for example, if a portion of a character is detected at a coordinate location (12, 90), the OCR software 206 utilizes the set of coordinate locations 805 to recognize this coordinate location as corresponding to location 8. Let it be assumed, for purposes of example, that no other portion of the character is detected at any of the remaining nine locations in the set of coordinate locations 805. The OCR software 206 then utilizes the look-up table 810 to identify (via row 812) that the character to be recognized is the numeral “1”. On the other hand, if other portions of the character are detected, for example at locations 1, 2, 3, 9, and 10, the OCR software 206 utilizes the look-up table 810 to identify (via row 811) that the character is the letter “Z” and not the numeral “1”.
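
A sketch of the single-step procedure, under stated assumptions, is shown below: only locations 1, 2, 3, and 8 have coordinates quoted in the text, so the remaining entries are placeholders, and the lookup table is reconstructed only for the two rows discussed (“1” and “Z”).

```python
from typing import Callable, Optional

# Locations 1, 2, 3, and 8 use the coordinates quoted above; the remaining entries
# are placeholders standing in for the full set of ten locations in FIG. 8.
TEN_LOCATIONS = {
    1: (0, 116), 2: (0, 64), 3: (0, 7), 4: (-20, 100), 5: (20, 100),
    6: (-20, 30), 7: (20, 30), 8: (12, 90), 9: (-12, 90), 10: (12, 30),
}

# Two rows of the look-up table 810 reconstructed from the examples above.
LOOKUP_810 = {
    frozenset({8}): "1",                     # row 812: ink only at location 8
    frozenset({1, 2, 3, 8, 9, 10}): "Z",     # row 811: ink also at locations 1, 2, 3, 9, and 10
}

def single_step_recognize(has_ink: Callable[[int, int], bool]) -> Optional[str]:
    """Probe the ten locations once and read the character straight from the table."""
    detected = frozenset(loc for loc, (x, y) in TEN_LOCATIONS.items() if has_ink(x, y))
    return LOOKUP_810.get(detected)          # None when the detected pattern is not tabulated
```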


It is also pertinent to point out that, unlike the character recognition procedures described above with respect to FIG. 4 and FIG. 6, which distinguish between a numeral and a letter based on the centerline height difference between numerals and letters of a monospaced font, the single-step recognition procedure does not require examination of the centerline height difference between two or more characters. Moreover, utilizing the single-step recognition procedure provides a savings in time in comparison to many traditional character recognition procedures, and this savings in time can optionally be used to execute additional procedures such as using the grid-based template and/or applying statistics to confirm a character recognition result obtained via the single-step recognition procedure. Thus, for example, a grid-based template (such as the grid-based template 300) can be utilized to execute a character recognition procedure for confirming the identity of a character recognized by utilizing the single-step recognition procedure.


To supplement the present disclosure, this application incorporates entirely by reference the following commonly assigned patents, patent application publications, and patent applications:

  • U.S. Pat. Nos. 6,832,725; 7,128,266;
  • U.S. Pat. Nos. 7,159,783; 7,413,127;
  • U.S. Pat. Nos. 7,726,575; 8,294,969;
  • U.S. Pat. Nos. 8,317,105; 8,322,622;
  • U.S. Pat. Nos. 8,366,005; 8,371,507;
  • U.S. Pat. Nos. 8,376,233; 8,381,979;
  • U.S. Pat. Nos. 8,390,909; 8,408,464;
  • U.S. Pat. Nos. 8,408,468; 8,408,469;
  • U.S. Pat. Nos. 8,424,768; 8,448,863;
  • U.S. Pat. Nos. 8,457,013; 8,459,557;
  • U.S. Pat. Nos. 8,469,272; 8,474,712;
  • U.S. Pat. Nos. 8,479,992; 8,490,877;
  • U.S. Pat. Nos. 8,517,271; 8,523,076;
  • U.S. Pat. Nos. 8,528,818; 8,544,737;
  • U.S. Pat. Nos. 8,548,242; 8,548,420;
  • U.S. Pat. Nos. 8,550,335; 8,550,354;
  • U.S. Pat. Nos. 8,550,357; 8,556,174;
  • U.S. Pat. Nos. 8,556,176; 8,556,177;
  • U.S. Pat. Nos. 8,559,767; 8,599,957;
  • U.S. Pat. Nos. 8,561,895; 8,561,903;
  • U.S. Pat. Nos. 8,561,905; 8,565,107;
  • U.S. Pat. Nos. 8,571,307; 8,579,200;
  • U.S. Pat. Nos. 8,583,924; 8,584,945;
  • U.S. Pat. Nos. 8,587,595; 8,587,697;
  • U.S. Pat. Nos. 8,588,869; 8,590,789;
  • U.S. Pat. Nos. 8,596,539; 8,596,542;
  • U.S. Pat. Nos. 8,596,543; 8,599,271;
  • U.S. Pat. Nos. 8,599,957; 8,600,158;
  • U.S. Pat. Nos. 8,600,167; 8,602,309;
  • U.S. Pat. Nos. 8,608,053; 8,608,071;
  • U.S. Pat. Nos. 8,611,309; 8,615,487;
  • U.S. Pat. Nos. 8,616,454; 8,621,123;
  • U.S. Pat. Nos. 8,622,303; 8,628,013;
  • U.S. Pat. Nos. 8,628,015; 8,628,016;
  • U.S. Pat. Nos. 8,629,926; 8,630,491;
  • U.S. Pat. Nos. 8,635,309; 8,636,200;
  • U.S. Pat. Nos. 8,636,212; 8,636,215;
  • U.S. Pat. Nos. 8,636,224; 8,638,806;
  • U.S. Pat. Nos. 8,640,958; 8,640,960;
  • U.S. Pat. Nos. 8,643,717; 8,646,692;
  • U.S. Pat. Nos. 8,646,694; 8,657,200;
  • U.S. Pat. Nos. 8,659,397; 8,668,149;
  • U.S. Pat. Nos. 8,678,285; 8,678,286;
  • U.S. Pat. Nos. 8,682,077; 8,687,282;
  • U.S. Pat. Nos. 8,692,927; 8,695,880;
  • U.S. Pat. Nos. 8,698,949; 8,717,494;
  • U.S. Pat. Nos. 8,717,494; 8,720,783;
  • U.S. Pat. Nos. 8,723,804; 8,723,904;
  • U.S. Pat. Nos. 8,727,223; 8,740,082;
  • U.S. Pat. Nos. 8,740,085; 8,746,563;
  • U.S. Pat. Nos. 8,750,445; 8,752,766;
  • U.S. Pat. Nos. 8,756,059; 8,757,495;
  • U.S. Pat. Nos. 8,760,563; 8,763,909;
  • U.S. Pat. Nos. 8,777,108; 8,777,109;
  • U.S. Pat. Nos. 8,779,898; 8,781,520;
  • U.S. Pat. Nos. 8,783,573; 8,789,757;
  • U.S. Pat. Nos. 8,789,758; 8,789,759;
  • U.S. Pat. Nos. 8,794,520; 8,794,522;
  • U.S. Pat. Nos. 8,794,525; 8,794,526;
  • U.S. Pat. Nos. 8,798,367; 8,807,431;
  • U.S. Pat. Nos. 8,807,432; 8,820,630;
  • U.S. Pat. Nos. 8,822,848; 8,824,692;
  • U.S. Pat. Nos. 8,824,696; 8,842,849;
  • U.S. Pat. Nos. 8,844,822; 8,844,823;
  • U.S. Pat. Nos. 8,849,019; 8,851,383;
  • U.S. Pat. Nos. 8,854,633; 8,866,963;
  • U.S. Pat. Nos. 8,868,421; 8,868,519;
  • U.S. Pat. Nos. 8,868,802; 8,868,803;
  • U.S. Pat. Nos. 8,870,074; 8,879,639;
  • U.S. Pat. Nos. 8,880,426; 8,881,983;
  • U.S. Pat. Nos. 8,881,987; 8,903,172;
  • U.S. Pat. Nos. 8,908,995; 8,910,870;
  • U.S. Pat. Nos. 8,910,875; 8,914,290;
  • U.S. Pat. Nos. 8,914,788; 8,915,439;
  • U.S. Pat. Nos. 8,915,444; 8,916,789;
  • U.S. Pat. Nos. 8,918,250; 8,918,564;
  • U.S. Pat. Nos. 8,925,818; 8,939,374;
  • U.S. Pat. Nos. 8,942,480; 8,944,313;
  • U.S. Pat. Nos. 8,944,327; 8,944,332;
  • U.S. Pat. Nos. 8,950,678; 8,967,468;
  • U.S. Pat. Nos. 8,971,346; 8,976,030;
  • U.S. Pat. Nos. 8,976,368; 8,978,981;
  • U.S. Pat. Nos. 8,978,983; 8,978,984;
  • U.S. Pat. Nos. 8,985,456; 8,985,457;
  • U.S. Pat. Nos. 8,985,459; 8,985,461;
  • U.S. Pat. Nos. 8,988,578; 8,988,590;
  • U.S. Pat. Nos. 8,991,704; 8,996,194;
  • U.S. Pat. Nos. 8,996,384; 9,002,641;
  • U.S. Pat. Nos. 9,007,368; 9,010,641;
  • U.S. Pat. Nos. 9,015,513; 9,016,576;
  • U.S. Pat. Nos. 9,022,288; 9,030,964;
  • U.S. Pat. Nos. 9,033,240; 9,033,242;
  • U.S. Pat. Nos. 9,036,054; 9,037,344;
  • U.S. Pat. Nos. 9,038,911; 9,038,915;
  • U.S. Pat. Nos. 9,047,098; 9,047,359;
  • U.S. Pat. Nos. 9,047,420; 9,047,525;
  • U.S. Pat. Nos. 9,047,531; 9,053,055;
  • U.S. Pat. Nos. 9,053,378; 9,053,380;
  • U.S. Pat. Nos. 9,058,526; 9,064,165;
  • U.S. Pat. Nos. 9,064,165; 9,064,167;
  • U.S. Pat. Nos. 9,064,168; 9,064,254;
  • U.S. Pat. Nos. 9,066,032; 9,070,032;
  • U.S. Pat. Nos. 9,076,459; 9,079,423;
  • U.S. Pat. Nos. 9,080,856; 9,082,023;
  • U.S. Pat. Nos. 9,082,031; 9,084,032;
  • U.S. Pat. Nos. 9,087,250; 9,092,681;
  • U.S. Pat. Nos. 9,092,682; 9,092,683;
  • U.S. Pat. Nos. 9,093,141; 9,098,763;
  • U.S. Pat. Nos. 9,104,929; 9,104,934;
  • U.S. Pat. Nos. 9,107,484; 9,111,159;
  • U.S. Pat. Nos. 9,111,166; 9,135,483;
  • U.S. Pat. Nos. 9,137,009; 9,141,839;
  • U.S. Pat. Nos. 9,147,096; 9,148,474;
  • U.S. Pat. Nos. 9,158,000; 9,158,340;
  • U.S. Pat. Nos. 9,158,953; 9,159,059;
  • U.S. Pat. Nos. 9,165,174; 9,171,543;
  • U.S. Pat. Nos. 9,183,425; 9,189,669;
  • U.S. Pat. Nos. 9,195,844; 9,202,458;
  • U.S. Pat. Nos. 9,208,366; 9,208,367;
  • U.S. Pat. Nos. 9,219,836; 9,224,024;
  • U.S. Pat. Nos. 9,224,027; 9,230,140;
  • U.S. Pat. Nos. 9,235,553; 9,239,950;
  • U.S. Pat. Nos. 9,245,492; 9,248,640;
  • U.S. Pat. Nos. 9,250,652; 9,250,712;
  • U.S. Pat. Nos. 9,251,411; 9,258,033;
  • U.S. Pat. Nos. 9,262,633; 9,262,660;
  • U.S. Pat. Nos. 9,262,662; 9,269,036;
  • U.S. Pat. Nos. 9,270,782; 9,274,812;
  • U.S. Pat. Nos. 9,275,388; 9,277,668;
  • U.S. Pat. Nos. 9,280,693; 9,286,496;
  • U.S. Pat. Nos. 9,298,964; 9,301,427;
  • U.S. Pat. Nos. 9,313,377; 9,317,037;
  • U.S. Pat. Nos. 9,319,548; 9,342,723;
  • U.S. Pat. Nos. 9,361,882; 9,365,381;
  • U.S. Pat. Nos. 9,373,018; 9,375,945;
  • U.S. Pat. Nos. 9,378,403; 9,383,848;
  • U.S. Pat. Nos. 9,384,374; 9,390,304;
  • U.S. Pat. Nos. 9,390,596; 9,411,386;
  • U.S. Pat. Nos. 9,412,242; 9,418,269;
  • U.S. Pat. Nos. 9,418,270; 9,465,967;
  • U.S. Pat. Nos. 9,423,318; 9,424,454;
  • U.S. Pat. Nos. 9,436,860; 9,443,123;
  • U.S. Pat. Nos. 9,443,222; 9,454,689;
  • U.S. Pat. Nos. 9,464,885; 9,465,967;
  • U.S. Pat. Nos. 9,478,983; 9,481,186;
  • U.S. Pat. Nos. 9,487,113; 9,488,986;
  • U.S. Pat. Nos. 9,489,782; 9,490,540;
  • U.S. Pat. Nos. 9,491,729; 9,497,092;
  • U.S. Pat. Nos. 9,507,974; 9,519,814;
  • U.S. Pat. Nos. 9,521,331; 9,530,038;
  • U.S. Pat. Nos. 9,572,901; 9,558,386;
  • U.S. Pat. Nos. 9,606,581; 9,646,189;
  • U.S. Pat. Nos. 9,646,191; 9,652,648;
  • U.S. Pat. Nos. 9,652,653; 9,656,487;
  • U.S. Pat. Nos. 9,659,198; 9,680,282;
  • U.S. Pat. Nos. 9,697,401; 9,701,140;
  • U.S. Design Pat. No. D702,237;
  • U.S. Design Pat. No. D716,285;
  • U.S. Design Pat. No. D723,560;
  • U.S. Design Pat. No. D730,357;
  • U.S. Design Pat. No. D730,901;
  • U.S. Design Pat. No. D730,902;
  • U.S. Design Pat. No. D734,339;
  • U.S. Design Pat. No. D737,321;
  • U.S. Design Pat. No. D754,205;
  • U.S. Design Pat. No. D754,206;
  • U.S. Design Pat. No. D757,009;
  • U.S. Design Pat. No. D760,719;
  • U.S. Design Pat. No. D762,604;
  • U.S. Design Pat. No. D766,244;
  • U.S. Design Pat. No. D777,166;
  • U.S. Design Pat. No. D771,631;
  • U.S. Design Pat. No. D783,601;
  • U.S. Design Pat. No. D785,617;
  • U.S. Design Pat. No. D785,636;
  • U.S. Design Pat. No. D790,505;
  • U.S. Design Pat. No. D790,546;
  • International Publication No. 2013/163789;
  • U.S. Patent Application Publication No. 2008/0185432;
  • U.S. Patent Application Publication No. 2009/0134221;
  • U.S. Patent Application Publication No. 2010/0177080;
  • U.S. Patent Application Publication No. 2010/0177076;
  • U.S. Patent Application Publication No. 2010/0177707;
  • U.S. Patent Application Publication No. 2010/0177749;
  • U.S. Patent Application Publication No. 2010/0265880;
  • U.S. Patent Application Publication No. 2011/0202554;
  • U.S. Patent Application Publication No. 2012/0111946;
  • U.S. Patent Application Publication No. 2012/0168511;
  • U.S. Patent Application Publication No. 2012/0168512;
  • U.S. Patent Application Publication No. 2012/0193423;
  • U.S. Patent Application Publication No. 2012/0194692;
  • U.S. Patent Application Publication No. 2012/0203647;
  • U.S. Patent Application Publication No. 2012/0223141;
  • U.S. Patent Application Publication No. 2012/0228382;
  • U.S. Patent Application Publication No. 2012/0248188;
  • U.S. Patent Application Publication No. 2013/0043312;
  • U.S. Patent Application Publication No. 2013/0082104;
  • U.S. Patent Application Publication No. 2013/0175341;
  • U.S. Patent Application Publication No. 2013/0175343;
  • U.S. Patent Application Publication No. 2013/0257744;
  • U.S. Patent Application Publication No. 2013/0257759;
  • U.S. Patent Application Publication No. 2013/0270346;
  • U.S. Patent Application Publication No. 2013/0292475;
  • U.S. Patent Application Publication No. 2013/0292477;
  • U.S. Patent Application Publication No. 2013/0293539;
  • U.S. Patent Application Publication No. 2013/0293540;
  • U.S. Patent Application Publication No. 2013/0306728;
  • U.S. Patent Application Publication No. 2013/0306731;
  • U.S. Patent Application Publication No. 2013/0307964;
  • U.S. Patent Application Publication No. 2013/0308625;
  • U.S. Patent Application Publication No. 2013/0313324;
  • U.S. Patent Application Publication No. 2013/0332996;
  • U.S. Patent Application Publication No. 2014/0001267;
  • U.S. Patent Application Publication No. 2014/0025584;
  • U.S. Patent Application Publication No. 2014/0034734;
  • U.S. Patent Application Publication No. 2014/0036848;
  • U.S. Patent Application Publication No. 2014/0039693;
  • U.S. Patent Application Publication No. 2014/0049120;
  • U.S. Patent Application Publication No. 2014/0049635;
  • U.S. Patent Application Publication No. 2014/0061306;
  • U.S. Patent Application Publication No. 2014/0063289;
  • U.S. Patent Application Publication No. 2014/0066136;
  • U.S. Patent Application Publication No. 2014/0067692;
  • U.S. Patent Application Publication No. 2014/0070005;
  • U.S. Patent Application Publication No. 2014/0071840;
  • U.S. Patent Application Publication No. 2014/0074746;
  • U.S. Patent Application Publication No. 2014/0076974;
  • U.S. Patent Application Publication No. 2014/0097249;
  • U.S. Patent Application Publication No. 2014/0098792;
  • U.S. Patent Application Publication No. 2014/0100813;
  • U.S. Patent Application Publication No. 2014/0103115;
  • U.S. Patent Application Publication No. 2014/0104413;
  • U.S. Patent Application Publication No. 2014/0104414;
  • U.S. Patent Application Publication No. 2014/0104416;
  • U.S. Patent Application Publication No. 2014/0106725;
  • U.S. Patent Application Publication No. 2014/0108010;
  • U.S. Patent Application Publication No. 2014/0108402;
  • U.S. Patent Application Publication No. 2014/0110485;
  • U.S. Patent Application Publication No. 2014/0125853;
  • U.S. Patent Application Publication No. 2014/0125999;
  • U.S. Patent Application Publication No. 2014/0129378;
  • U.S. Patent Application Publication No. 2014/0131443;
  • U.S. Patent Application Publication No. 2014/0133379;
  • U.S. Patent Application Publication No. 2014/0136208;
  • U.S. Patent Application Publication No. 2014/0140585;
  • U.S. Patent Application Publication No. 2014/0152882;
  • U.S. Patent Application Publication No. 2014/0158770;
  • U.S. Patent Application Publication No. 2014/0159869;
  • U.S. Patent Application Publication No. 2014/0166759;
  • U.S. Patent Application Publication No. 2014/0168787;
  • U.S. Patent Application Publication No. 2014/0175165;
  • U.S. Patent Application Publication No. 2014/0191684;
  • U.S. Patent Application Publication No. 2014/0191913;
  • U.S. Patent Application Publication No. 2014/0197304;
  • U.S. Patent Application Publication No. 2014/0214631;
  • U.S. Patent Application Publication No. 2014/0217166;
  • U.S. Patent Application Publication No. 2014/0231500;
  • U.S. Patent Application Publication No. 2014/0247315;
  • U.S. Patent Application Publication No. 2014/0263493;
  • U.S. Patent Application Publication No. 2014/0263645;
  • U.S. Patent Application Publication No. 2014/0270196;
  • U.S. Patent Application Publication No. 2014/0270229;
  • U.S. Patent Application Publication No. 2014/0278387;
  • U.S. Patent Application Publication No. 2014/0288933;
  • U.S. Patent Application Publication No. 2014/0297058;
  • U.S. Patent Application Publication No. 2014/0299665;
  • U.S. Patent Application Publication No. 2014/0332590;
  • U.S. Patent Application Publication No. 2014/0351317;
  • U.S. Patent Application Publication No. 2014/0362184;
  • U.S. Patent Application Publication No. 2014/0363015;
  • U.S. Patent Application Publication No. 2014/0369511;
  • U.S. Patent Application Publication No. 2014/0374483;
  • U.S. Patent Application Publication No. 2014/0374485;
  • U.S. Patent Application Publication No. 2015/0001301;
  • U.S. Patent Application Publication No. 2015/0001304;
  • U.S. Patent Application Publication No. 2015/0009338;
  • U.S. Patent Application Publication No. 2015/0014416;
  • U.S. Patent Application Publication No. 2015/0021397;
  • U.S. Patent Application Publication No. 2015/0028104;
  • U.S. Patent Application Publication No. 2015/0029002;
  • U.S. Patent Application Publication No. 2015/0032709;
  • U.S. Patent Application Publication No. 2015/0039309;
  • U.S. Patent Application Publication No. 2015/0039878;
  • U.S. Patent Application Publication No. 2015/0040378;
  • U.S. Patent Application Publication No. 2015/0049347;
  • U.S. Patent Application Publication No. 2015/0051992;
  • U.S. Patent Application Publication No. 2015/0053769;
  • U.S. Patent Application Publication No. 2015/0062366;
  • U.S. Patent Application Publication No. 2015/0063215;
  • U.S. Patent Application Publication No. 2015/0088522;
  • U.S. Patent Application Publication No. 2015/0096872;
  • U.S. Patent Application Publication No. 2015/0100196;
  • U.S. Patent Application Publication No. 2015/0102109;
  • U.S. Patent Application Publication No. 2015/0115035;
  • U.S. Patent Application Publication No. 2015/0127791;
  • U.S. Patent Application Publication No. 2015/0128116;
  • U.S. Patent Application Publication No. 2015/0133047;
  • U.S. Patent Application Publication No. 2015/0134470;
  • U.S. Patent Application Publication No. 2015/0136851;
  • U.S. Patent Application Publication No. 2015/0142492;
  • U.S. Patent Application Publication No. 2015/0144692;
  • U.S. Patent Application Publication No. 2015/0144698;
  • U.S. Patent Application Publication No. 2015/0149946;
  • U.S. Patent Application Publication No. 2015/0161429;
  • U.S. Patent Application Publication No. 2015/0178523;
  • U.S. Patent Application Publication No. 2015/0178537;
  • U.S. Patent Application Publication No. 2015/0178685;
  • U.S. Patent Application Publication No. 2015/0181109;
  • U.S. Patent Application Publication No. 2015/0199957;
  • U.S. Patent Application Publication No. 2015/0210199;
  • U.S. Patent Application Publication No. 2015/0212565;
  • U.S. Patent Application Publication No. 2015/0213647;
  • U.S. Patent Application Publication No. 2015/0220753;
  • U.S. Patent Application Publication No. 2015/0220901;
  • U.S. Patent Application Publication No. 2015/0227189;
  • U.S. Patent Application Publication No. 2015/0236984;
  • U.S. Patent Application Publication No. 2015/0239348;
  • U.S. Patent Application Publication No. 2015/0242658;
  • U.S. Patent Application Publication No. 2015/0248572;
  • U.S. Patent Application Publication No. 2015/0254485;
  • U.S. Patent Application Publication No. 2015/0261643;
  • U.S. Patent Application Publication No. 2015/0264624;
  • U.S. Patent Application Publication No. 2015/0268971;
  • U.S. Patent Application Publication No. 2015/0269402;
  • U.S. Patent Application Publication No. 2015/0288689;
  • U.S. Patent Application Publication No. 2015/0288896;
  • U.S. Patent Application Publication No. 2015/0310243;
  • U.S. Patent Application Publication No. 2015/0310244;
  • U.S. Patent Application Publication No. 2015/0310389;
  • U.S. Patent Application Publication No. 2015/0312780;
  • U.S. Patent Application Publication No. 2015/0327012;
  • U.S. Patent Application Publication No. 2016/0014251;
  • U.S. Patent Application Publication No. 2016/0025697;
  • U.S. Patent Application Publication No. 2016/0026838;
  • U.S. Patent Application Publication No. 2016/0026839;
  • U.S. Patent Application Publication No. 2016/0040982;
  • U.S. Patent Application Publication No. 2016/0042241;
  • U.S. Patent Application Publication No. 2016/0057230;
  • U.S. Patent Application Publication No. 2016/0062473;
  • U.S. Patent Application Publication No. 2016/0070944;
  • U.S. Patent Application Publication No. 2016/0092805;
  • U.S. Patent Application Publication No. 2016/0101936;
  • U.S. Patent Application Publication No. 2016/0104019;
  • U.S. Patent Application Publication No. 2016/0104274;
  • U.S. Patent Application Publication No. 2016/0109219;
  • U.S. Patent Application Publication No. 2016/0109220;
  • U.S. Patent Application Publication No. 2016/0109224;
  • U.S. Patent Application Publication No. 2016/0112631;
  • U.S. Patent Application Publication No. 2016/0112643;
  • U.S. Patent Application Publication No. 2016/0117627;
  • U.S. Patent Application Publication No. 2016/0124516;
  • U.S. Patent Application Publication No. 2016/0125217;
  • U.S. Patent Application Publication No. 2016/0125342;
  • U.S. Patent Application Publication No. 2016/0125873;
  • U.S. Patent Application Publication No. 2016/0133253;
  • U.S. Patent Application Publication No. 2016/0171597;
  • U.S. Patent Application Publication No. 2016/0171666;
  • U.S. Patent Application Publication No. 2016/0171720;
  • U.S. Patent Application Publication No. 2016/0171775;
  • U.S. Patent Application Publication No. 2016/0171777;
  • U.S. Patent Application Publication No. 2016/0174674;
  • U.S. Patent Application Publication No. 2016/0178479;
  • U.S. Patent Application Publication No. 2016/0178685;
  • U.S. Patent Application Publication No. 2016/0178707;
  • U.S. Patent Application Publication No. 2016/0179132;
  • U.S. Patent Application Publication No. 2016/0179143;
  • U.S. Patent Application Publication No. 2016/0179368;
  • U.S. Patent Application Publication No. 2016/0179378;
  • U.S. Patent Application Publication No. 2016/0180130;
  • U.S. Patent Application Publication No. 2016/0180133;
  • U.S. Patent Application Publication No. 2016/0180136;
  • U.S. Patent Application Publication No. 2016/0180594;
  • U.S. Patent Application Publication No. 2016/0180663;
  • U.S. Patent Application Publication No. 2016/0180678;
  • U.S. Patent Application Publication No. 2016/0180713;
  • U.S. Patent Application Publication No. 2016/0185136;
  • U.S. Patent Application Publication No. 2016/0185291;
  • U.S. Patent Application Publication No. 2016/0186926;
  • U.S. Patent Application Publication No. 2016/0188861;
  • U.S. Patent Application Publication No. 2016/0188939;
  • U.S. Patent Application Publication No. 2016/0188940;
  • U.S. Patent Application Publication No. 2016/0188941;
  • U.S. Patent Application Publication No. 2016/0188942;
  • U.S. Patent Application Publication No. 2016/0188943;
  • U.S. Patent Application Publication No. 2016/0188944;
  • U.S. Patent Application Publication No. 2016/0189076;
  • U.S. Patent Application Publication No. 2016/0189087;
  • U.S. Patent Application Publication No. 2016/0189088;
  • U.S. Patent Application Publication No. 2016/0189092;
  • U.S. Patent Application Publication No. 2016/0189284;
  • U.S. Patent Application Publication No. 2016/0189288;
  • U.S. Patent Application Publication No. 2016/0189366;
  • U.S. Patent Application Publication No. 2016/0189443;
  • U.S. Patent Application Publication No. 2016/0189447;
  • U.S. Patent Application Publication No. 2016/0189489;
  • U.S. Patent Application Publication No. 2016/0192051;
  • U.S. Patent Application Publication No. 2016/0202951;
  • U.S. Patent Application Publication No. 2016/0202958;
  • U.S. Patent Application Publication No. 2016/0202959;
  • U.S. Patent Application Publication No. 2016/0203021;
  • U.S. Patent Application Publication No. 2016/0203429;
  • U.S. Patent Application Publication No. 2016/0203797;
  • U.S. Patent Application Publication No. 2016/0203820;
  • U.S. Patent Application Publication No. 2016/0204623;
  • U.S. Patent Application Publication No. 2016/0204636;
  • U.S. Patent Application Publication No. 2016/0204638;
  • U.S. Patent Application Publication No. 2016/0227912;
  • U.S. Patent Application Publication No. 2016/0232891;
  • U.S. Patent Application Publication No. 2016/0292477;
  • U.S. Patent Application Publication No. 2016/0294779;
  • U.S. Patent Application Publication No. 2016/0306769;
  • U.S. Patent Application Publication No. 2016/0314276;
  • U.S. Patent Application Publication No. 2016/0314294;
  • U.S. Patent Application Publication No. 2016/0316190;
  • U.S. Patent Application Publication No. 2016/0323310;
  • U.S. Patent Application Publication No. 2016/0325677;
  • U.S. Patent Application Publication No. 2016/0327614;
  • U.S. Patent Application Publication No. 2016/0327930;
  • U.S. Patent Application Publication No. 2016/0328762;
  • U.S. Patent Application Publication No. 2016/0330218;
  • U.S. Patent Application Publication No. 2016/0343163;
  • U.S. Patent Application Publication No. 2016/0343176;
  • U.S. Patent Application Publication No. 2016/0364914;
  • U.S. Patent Application Publication No. 2016/0370220;
  • U.S. Patent Application Publication No. 2016/0372282;
  • U.S. Patent Application Publication No. 2016/0373847;
  • U.S. Patent Application Publication No. 2016/0377414;
  • U.S. Patent Application Publication No. 2016/0377417;
  • U.S. Patent Application Publication No. 2017/0010141;
  • U.S. Patent Application Publication No. 2017/0010328;
  • U.S. Patent Application Publication No. 2017/0010780;
  • U.S. Patent Application Publication No. 2017/0016714;
  • U.S. Patent Application Publication No. 2017/0018094;
  • U.S. Patent Application Publication No. 2017/0046603;
  • U.S. Patent Application Publication No. 2017/0047864;
  • U.S. Patent Application Publication No. 2017/0053146;
  • U.S. Patent Application Publication No. 2017/0053147;
  • U.S. Patent Application Publication No. 2017/0053647;
  • U.S. Patent Application Publication No. 2017/0055606;
  • U.S. Patent Application Publication No. 2017/0060316;
  • U.S. Patent Application Publication No. 2017/0061961;
  • U.S. Patent Application Publication No. 2017/0064634;
  • U.S. Patent Application Publication No. 2017/0083730;
  • U.S. Patent Application Publication No. 2017/0091502;
  • U.S. Patent Application Publication No. 2017/0091706;
  • U.S. Patent Application Publication No. 2017/0091741;
  • U.S. Patent Application Publication No. 2017/0091904;
  • U.S. Patent Application Publication No. 2017/0092908;
  • U.S. Patent Application Publication No. 2017/0094238;
  • U.S. Patent Application Publication No. 2017/0098947;
  • U.S. Patent Application Publication No. 2017/0100949;
  • U.S. Patent Application Publication No. 2017/0108838;
  • U.S. Patent Application Publication No. 2017/0108895;
  • U.S. Patent Application Publication No. 2017/0118355;
  • U.S. Patent Application Publication No. 2017/0123598;
  • U.S. Patent Application Publication No. 2017/0124369;
  • U.S. Patent Application Publication No. 2017/0124396;
  • U.S. Patent Application Publication No. 2017/0124687;
  • U.S. Patent Application Publication No. 2017/0126873;
  • U.S. Patent Application Publication No. 2017/0126904;
  • U.S. Patent Application Publication No. 2017/0139012;
  • U.S. Patent Application Publication No. 2017/0140329;
  • U.S. Patent Application Publication No. 2017/0140731;
  • U.S. Patent Application Publication No. 2017/0147847;
  • U.S. Patent Application Publication No. 2017/0150124;
  • U.S. Patent Application Publication No. 2017/0169198;
  • U.S. Patent Application Publication No. 2017/0171035;
  • U.S. Patent Application Publication No. 2017/0171703;
  • U.S. Patent Application Publication No. 2017/0171803;
  • U.S. Patent Application Publication No. 2017/0180359;
  • U.S. Patent Application Publication No. 2017/0180577;
  • U.S. Patent Application Publication No. 2017/0181299;
  • U.S. Patent Application Publication No. 2017/0190192;
  • U.S. Patent Application Publication No. 2017/0193432;
  • U.S. Patent Application Publication No. 2017/0193461;
  • U.S. Patent Application Publication No. 2017/0193727;
  • U.S. Patent Application Publication No. 2017/0199266;
  • U.S. Patent Application Publication No. 2017/0200108; and
  • U.S. Patent Application Publication No. 2017/0200275.


In the specification and/or figures, exemplary embodiments of the invention have been disclosed. The present disclosure is not limited to such exemplary embodiments. The use of the term “and/or” includes any and all combinations of one or more of the associated listed items. The figures are schematic representations and so are not necessarily drawn to scale. Unless otherwise noted, specific terms have been used in a generic and descriptive sense and not for purposes of limitation.

Claims
  • 1. A system, comprising: a processor; and a non-transitory memory that stores executable instructions that, when executed by the processor, cause the processor to: apply a bounding box to a character contained in a barcode label; designate a location of the bounding box as an identification area in the bounding box, wherein, in response to the character being present in the bounding box, the identification area is configured to contain a portion of the character and exclude all portions of all other characters in a set of characters associated with the bounding box; and perform optical character recognition with respect to the identification area to detect a portion of the character that uniquely identifies the character from among the set of characters.
  • 2. The system of claim 1, wherein the executable instructions further cause the processor to: apply a grid-based template to the bounding box; and define a portion of the grid-based template as a primary search area, wherein the primary search area is an area within the bounding box and is smaller than the bounding box.
  • 3. The system of claim 2, wherein the executable instructions further cause the processor to: identify the character contained in the bounding box based on the primary search area.
  • 4. The system of claim 2, wherein the executable instructions further cause the processor to: define a set of locations inside the primary search area, wherein a location of the set of locations is identifiable by coordinates of an x-y mapping system.
  • 5. The system of claim 1, wherein the executable instructions further cause the processor to: apply a grid-based template to the bounding box; and define a portion of the grid-based template as a primary search area shaped as a circle that comprises a diameter that is substantially equal to a width of a monospaced font, wherein the primary search area is an area within the bounding box and is smaller than the bounding box.
  • 6. The system of claim 1, wherein the executable instructions further cause the processor to: apply a grid-based template to the bounding box; and define a portion of the grid-based template as a primary search area shaped as an oval that comprises a diameter that is substantially equal to a width of a monospaced font, wherein the primary search area is an area within the bounding box and is smaller than the bounding box.
  • 7. The system of claim 1, wherein the character comprises a monospaced font and comprises any one of a numeral, a letter, or a mathematical symbol.
  • 8. The system of claim 1, wherein the bounding box comprises a grid pattern that is characterized using coordinates of an x-y mapping system.
  • 9. A computer-implemented method, comprising: applying, by a device comprising a processor, a bounding box to a character contained in a barcode label; designating, by the device, a location of the bounding box as an identification area in the bounding box, wherein, in response to the character being present in the bounding box, the identification area is configured to contain a portion of the character and exclude all portions of all other characters in a set of characters associated with the bounding box; and performing, by the device, optical character recognition with respect to the identification area to detect a portion of the character that uniquely identifies the character from among the set of characters.
  • 10. The computer-implemented method of claim 9, further comprising: applying, by the device, a grid-based template to the bounding box; and defining, by the device, a portion of the grid-based template as a primary search area, wherein the primary search area is an area within the bounding box and is smaller than the bounding box.
  • 11. The computer-implemented method of claim 10, further comprising: identifying, by the device, the character contained in the bounding box based on the primary search area.
  • 12. The computer-implemented method of claim 10, further comprising: defining, by the device, a set of locations inside the primary search area, wherein a location of the set of locations is identifiable by coordinates of an x-y mapping system.
  • 13. The computer-implemented method of claim 9, further comprising: applying, by the device, a grid-based template to the bounding box; and defining, by the device, a portion of the grid-based template as a primary search area shaped as a circle that comprises a diameter that is substantially equal to a width of a monospaced font, wherein the primary search area is an area within the bounding box and is smaller than the bounding box.
  • 14. The computer-implemented method of claim 9, further comprising: applying, by the device, a grid-based template to the bounding box; and defining, by the device, a portion of the grid-based template as a primary search area shaped as an oval that comprises a diameter that is substantially equal to a width of a monospaced font, wherein the primary search area is an area within the bounding box and is smaller than the bounding box.
  • 15. A computer program product comprising at least one non-transitory computer-readable storage medium having program instructions embodied thereon, the program instructions executable by a processor to cause the processor to: apply a bounding box to a character contained in a barcode label; designate a location of the bounding box as an identification area in the bounding box, wherein, in response to the character being present in the bounding box, the identification area is configured to contain a portion of the character and exclude all portions of all other characters in a set of characters associated with the bounding box; and perform optical character recognition with respect to the identification area to detect a portion of the character that uniquely identifies the character from among the set of characters.
  • 16. The computer program product of claim 15, wherein the program instructions are executable by the processor to cause the processor to: apply a grid-based template to the bounding box; and define a portion of the grid-based template as a primary search area, wherein the primary search area is an area within the bounding box and is smaller than the bounding box.
  • 17. The computer program product of claim 16, wherein the program instructions are executable by the processor to cause the processor to: identify the character contained in the bounding box based on the primary search area.
  • 18. The computer program product of claim 16, wherein the program instructions are executable by the processor to cause the processor to: define a set of locations inside the primary search area, wherein a location of the set of locations is identifiable by coordinates of an x-y mapping system.
  • 19. The computer program product of claim 15, wherein the program instructions are executable by the processor to cause the processor to: apply a grid-based template to the bounding box; and define a portion of the grid-based template as a primary search area shaped as a circle that comprises a diameter that is substantially equal to a width of a monospaced font, wherein the primary search area is an area within the bounding box and is smaller than the bounding box.
  • 20. The computer program product of claim 15, wherein the program instructions are executable by the processor to cause the processor to: apply a grid-based template to the bounding box; and define a portion of the grid-based template as a primary search area shaped as an oval that comprises a diameter that is substantially equal to a width of a monospaced font, wherein the primary search area is an area within the bounding box and is smaller than the bounding box.
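
The claims recite applying a grid-based template to a bounding box and performing recognition only on a designated identification area, or primary search area, that distinguishes one character from the others in the associated set. As a rough illustration only, the following Python sketch shows one way such a single-section check could look; the 5x3 grid, the 5% ink-fraction threshold, the synthetic glyphs, and the two-candidate decision are illustrative assumptions and are not taken from the claims or the specification.

```python
# Illustrative sketch only, not the claimed implementation.
# Assumptions (not from the claims): a binarized bounding-box image,
# a 5x3 grid-based template, and a 5% ink-fraction threshold.
import numpy as np


def split_into_grid(bbox_img: np.ndarray, rows: int = 5, cols: int = 3) -> dict:
    """Divide a binarized bounding-box image into rows x cols grid sections."""
    h, w = bbox_img.shape
    r_edges = np.linspace(0, h, rows + 1, dtype=int)
    c_edges = np.linspace(0, w, cols + 1, dtype=int)
    return {(r, c): bbox_img[r_edges[r]:r_edges[r + 1], c_edges[c]:c_edges[c + 1]]
            for r in range(rows) for c in range(cols)}


def has_ink(section: np.ndarray, min_fill: float = 0.05) -> bool:
    """Treat a grid section as containing part of the character when enough pixels are set."""
    return section.size > 0 and float(section.mean()) > min_fill


def identify_from_area(bbox_img: np.ndarray, identification_area: tuple,
                       if_present: str, if_absent: str) -> str:
    """Decide between two candidate characters by inspecting a single grid section."""
    sections = split_into_grid(bbox_img)
    return if_present if has_ink(sections[identification_area]) else if_absent


if __name__ == "__main__":
    # Two synthetic 20x12 glyphs in a monospaced-style bounding box:
    # one vertical stroke spans the full height, the other starts lower.
    tall_glyph = np.zeros((20, 12), dtype=np.uint8)
    tall_glyph[0:20, 5:7] = 1
    short_glyph = np.zeros((20, 12), dtype=np.uint8)
    short_glyph[8:20, 5:7] = 1

    # Grid section (0, 1), the top-middle cell, holds ink only for the taller
    # glyph, so checking that one section is enough to tell the candidates apart.
    print(identify_from_area(tall_glyph, (0, 1), "tall candidate", "short candidate"))
    print(identify_from_area(short_glyph, (0, 1), "tall candidate", "short candidate"))
```

A fuller treatment would repeat this check over several identification areas and map the resulting pattern of hits to a specific character; the single-section check above simply illustrates why inspecting a small designated area can be faster than matching an entire glyph against a template.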
CROSS-REFERENCE TO RELATED APPLICATION

This application is a continuation of and claims priority to U.S. patent application Ser. No. 15/793,407, titled “OPTICAL CHARACTER RECOGNITION SYSTEMS AND METHODS,” and filed on Oct. 25, 2017, the entire content of which is hereby incorporated by reference.

US Referenced Citations (704)
Number Name Date Kind
4468809 Grabowski et al. Aug 1984 A
5077809 Ghazizadeh Dec 1991 A
5208869 Holt May 1993 A
5367578 Golem Nov 1994 A
5539840 Krtolica Jul 1996 A
5835634 Abrams Nov 1998 A
5852685 Shepard Dec 1998 A
5880451 Smith Mar 1999 A
5915039 Lorie Jun 1999 A
6038342 Bernzott Mar 2000 A
6259814 Krtolica Jul 2001 B1
6321986 Ackley Nov 2001 B1
6337924 Smith Jan 2002 B1
6366696 Hertz Apr 2002 B1
6832725 Gardiner et al. Dec 2004 B2
7128266 Zhu et al. Oct 2006 B2
7159783 Walczyk et al. Jan 2007 B2
7413127 Ehrhart et al. Aug 2008 B2
7726575 Wang et al. Jun 2010 B2
8294969 Plesko Oct 2012 B2
8317105 Kotlarsky et al. Nov 2012 B2
8322622 Liu Dec 2012 B2
8366005 Kotlarsky et al. Feb 2013 B2
8371507 Haggerty et al. Feb 2013 B2
8376233 Horn et al. Feb 2013 B2
8381979 Franz Feb 2013 B2
8390909 Plesko Mar 2013 B2
8401299 Nakamura Mar 2013 B2
8408464 Zhu et al. Apr 2013 B2
8408468 Van et al. Apr 2013 B2
8408469 Good Apr 2013 B2
8424768 Rueblinger et al. Apr 2013 B2
8448863 Xian et al. May 2013 B2
8457013 Essinger et al. Jun 2013 B2
8459557 Havens et al. Jun 2013 B2
8469272 Kearney Jun 2013 B2
8474712 Kearney et al. Jul 2013 B2
8479992 Kotlarsky et al. Jul 2013 B2
8490877 Kearney Jul 2013 B2
8517271 Kotlarsky et al. Aug 2013 B2
8523076 Good Sep 2013 B2
8528818 Ehrhart et al. Sep 2013 B2
8544737 Gomez et al. Oct 2013 B2
8548420 Grunow et al. Oct 2013 B2
8550335 Samek et al. Oct 2013 B2
8550354 Gannon et al. Oct 2013 B2
8550357 Kearney Oct 2013 B2
8556174 Kosecki et al. Oct 2013 B2
8556176 Van et al. Oct 2013 B2
8556177 Hussey et al. Oct 2013 B2
8559767 Barber et al. Oct 2013 B2
8561895 Gomez et al. Oct 2013 B2
8561903 Sauerwein, Jr. Oct 2013 B2
8561905 Edmonds et al. Oct 2013 B2
8565107 Pease et al. Oct 2013 B2
8571307 Li et al. Oct 2013 B2
8579200 Samek et al. Nov 2013 B2
8583924 Caballero et al. Nov 2013 B2
8584945 Wang et al. Nov 2013 B2
8587595 Wang Nov 2013 B2
8587697 Hussey et al. Nov 2013 B2
8588869 Sauerwein et al. Nov 2013 B2
8590789 Nahill et al. Nov 2013 B2
8596539 Havens et al. Dec 2013 B2
8596542 Havens et al. Dec 2013 B2
8596543 Havens et al. Dec 2013 B2
8599271 Havens et al. Dec 2013 B2
8599957 Peake et al. Dec 2013 B2
8600158 Li et al. Dec 2013 B2
8600167 Showering Dec 2013 B2
8602309 Longacre et al. Dec 2013 B2
8608053 Meier et al. Dec 2013 B2
8608071 Liu et al. Dec 2013 B2
8611309 Wang et al. Dec 2013 B2
8615487 Gomez et al. Dec 2013 B2
8621123 Caballero Dec 2013 B2
8622303 Meier et al. Jan 2014 B2
8628013 Ding Jan 2014 B2
8628015 Wang et al. Jan 2014 B2
8628016 Winegar Jan 2014 B2
8629926 Wang Jan 2014 B2
8630491 Longacre et al. Jan 2014 B2
8635309 Berthiaume et al. Jan 2014 B2
8636200 Kearney Jan 2014 B2
8636212 Nahill et al. Jan 2014 B2
8636215 Ding et al. Jan 2014 B2
8636224 Wang Jan 2014 B2
8638806 Wang et al. Jan 2014 B2
8640958 Lu et al. Feb 2014 B2
8640960 Wang et al. Feb 2014 B2
8643717 Li et al. Feb 2014 B2
8646692 Meier et al. Feb 2014 B2
8646694 Wang et al. Feb 2014 B2
8657200 Ren et al. Feb 2014 B2
8659397 Vargo et al. Feb 2014 B2
8668149 Good Mar 2014 B2
8678285 Kearney Mar 2014 B2
8678286 Smith et al. Mar 2014 B2
8682077 Longacre, Jr. Mar 2014 B1
D702237 Oberpriller et al. Apr 2014 S
8687282 Feng et al. Apr 2014 B2
8692927 Pease et al. Apr 2014 B2
8695880 Bremer et al. Apr 2014 B2
8698949 Grunow et al. Apr 2014 B2
8702000 Barber et al. Apr 2014 B2
8717494 Gannon May 2014 B2
8720783 Biss et al. May 2014 B2
8723804 Fletcher et al. May 2014 B2
8723904 Marty et al. May 2014 B2
8727223 Wang May 2014 B2
8740082 Wilz, Sr. Jun 2014 B2
8740085 Furlong et al. Jun 2014 B2
8746563 Hennick et al. Jun 2014 B2
8750445 Peake et al. Jun 2014 B2
8752766 Xian et al. Jun 2014 B2
8755604 Gross et al. Jun 2014 B1
8756059 Braho et al. Jun 2014 B2
8757495 Qu et al. Jun 2014 B2
8760563 Koziol et al. Jun 2014 B2
8763909 Reed et al. Jul 2014 B2
8777108 Coyle Jul 2014 B2
8777109 Oberpriller et al. Jul 2014 B2
8779898 Havens et al. Jul 2014 B2
8781520 Payne et al. Jul 2014 B2
8783573 Havens et al. Jul 2014 B2
8789757 Barten Jul 2014 B2
8789758 Hawley et al. Jul 2014 B2
8789759 Xian et al. Jul 2014 B2
8794520 Wang et al. Aug 2014 B2
8794522 Ehrhart Aug 2014 B2
8794525 Amundsen et al. Aug 2014 B2
8794526 Wang et al. Aug 2014 B2
8798367 Ellis Aug 2014 B2
8807431 Wang et al. Aug 2014 B2
8807432 Van et al. Aug 2014 B2
8820630 Qu et al. Sep 2014 B2
8822848 Meagher Sep 2014 B2
8824692 Sheerin et al. Sep 2014 B2
8824696 Braho Sep 2014 B2
8842849 Wahl et al. Sep 2014 B2
8844822 Kotlarsky et al. Sep 2014 B2
8844823 Fritz et al. Sep 2014 B2
8849019 Li et al. Sep 2014 B2
D716285 Chaney et al. Oct 2014 S
8851383 Yeakley et al. Oct 2014 B2
8854633 Laffargue et al. Oct 2014 B2
8866963 Grunow et al. Oct 2014 B2
8868421 Braho et al. Oct 2014 B2
8868519 Maloy et al. Oct 2014 B2
8868802 Barten Oct 2014 B2
8868803 Caballero Oct 2014 B2
8870074 Gannon Oct 2014 B1
8879639 Sauerwein, Jr. Nov 2014 B2
8880426 Smith Nov 2014 B2
8881983 Havens et al. Nov 2014 B2
8881987 Wang Nov 2014 B2
8903172 Smith Dec 2014 B2
8908995 Benos et al. Dec 2014 B2
8910870 Li et al. Dec 2014 B2
8910875 Ren et al. Dec 2014 B2
8914290 Hendrickson et al. Dec 2014 B2
8914788 Pettinelli et al. Dec 2014 B2
8915439 Feng et al. Dec 2014 B2
8915444 Havens et al. Dec 2014 B2
8916789 Woodburn Dec 2014 B2
8918250 Hollifield Dec 2014 B2
8918564 Caballero Dec 2014 B2
8925818 Kosecki et al. Jan 2015 B2
8939374 Jovanovski et al. Jan 2015 B2
8942480 Ellis Jan 2015 B2
8944313 Williams et al. Feb 2015 B2
8944327 Meier et al. Feb 2015 B2
8944332 Harding et al. Feb 2015 B2
8950678 Germaine et al. Feb 2015 B2
D723560 Zhou et al. Mar 2015 S
8967468 Gomez et al. Mar 2015 B2
8971346 Sevier Mar 2015 B2
8976030 Cunningham et al. Mar 2015 B2
8976368 El et al. Mar 2015 B2
8978981 Guan Mar 2015 B2
8978983 Bremer et al. Mar 2015 B2
8978984 Hennick et al. Mar 2015 B2
8985456 Zhu et al. Mar 2015 B2
8985457 Soule et al. Mar 2015 B2
8985459 Kearney et al. Mar 2015 B2
8985461 Gelay et al. Mar 2015 B2
8988578 Showering Mar 2015 B2
8988590 Gillet et al. Mar 2015 B2
8991704 Hopper et al. Mar 2015 B2
8996194 Davis et al. Mar 2015 B2
8996384 Funyak et al. Mar 2015 B2
8998091 Edmonds et al. Apr 2015 B2
9002641 Showering Apr 2015 B2
9007368 Laffargue et al. Apr 2015 B2
9010641 Qu et al. Apr 2015 B2
9015513 Murawski et al. Apr 2015 B2
9016576 Brady et al. Apr 2015 B2
D730357 Fitch et al. May 2015 S
9022288 Nahill et al. May 2015 B2
9030964 Essinger et al. May 2015 B2
9033240 Smith et al. May 2015 B2
9033242 Gillet et al. May 2015 B2
9036054 Koziol et al. May 2015 B2
9037344 Chamberlin May 2015 B2
9038911 Xian et al. May 2015 B2
9038915 Smith May 2015 B2
D730901 Oberpriller et al. Jun 2015 S
D730902 Fitch et al. Jun 2015 S
9047098 Barten Jun 2015 B2
9047359 Caballero et al. Jun 2015 B2
9047420 Caballero Jun 2015 B2
9047525 Barber et al. Jun 2015 B2
9047531 Showering et al. Jun 2015 B2
9049640 Wang et al. Jun 2015 B2
9053055 Caballero Jun 2015 B2
9053378 Hou et al. Jun 2015 B1
9053380 Xian et al. Jun 2015 B2
9057641 Amundsen et al. Jun 2015 B2
9058526 Powilleit Jun 2015 B2
9061527 Tobin et al. Jun 2015 B2
9064165 Havens et al. Jun 2015 B2
9064167 Xian et al. Jun 2015 B2
9064168 Todeschini et al. Jun 2015 B2
9064254 Todeschini et al. Jun 2015 B2
9066032 Wang Jun 2015 B2
9070032 Corcoran Jun 2015 B2
D734339 Zhou et al. Jul 2015 S
D734751 Oberpriller et al. Jul 2015 S
9076459 Braho et al. Jul 2015 B2
9079423 Bouverie et al. Jul 2015 B2
9080856 Laffargue Jul 2015 B2
9082023 Feng et al. Jul 2015 B2
9082031 Liu et al. Jul 2015 B2
9084032 Rautiola et al. Jul 2015 B2
9087250 Coyle Jul 2015 B2
9092681 Havens et al. Jul 2015 B2
9092682 Wilz et al. Jul 2015 B2
9092683 Koziol et al. Jul 2015 B2
9093141 Liu Jul 2015 B2
D737321 Lee Aug 2015 S
9098763 Lu et al. Aug 2015 B2
9104929 Todeschini Aug 2015 B2
9104934 Li et al. Aug 2015 B2
9107484 Chaney Aug 2015 B2
9111159 Liu et al. Aug 2015 B2
9111166 Cunningham, IV Aug 2015 B2
9135483 Liu et al. Sep 2015 B2
9137009 Gardiner Sep 2015 B1
9141839 Xian et al. Sep 2015 B2
9147096 Wang Sep 2015 B2
9148474 Skvoretz Sep 2015 B2
9158000 Sauerwein, Jr. Oct 2015 B2
9158340 Reed et al. Oct 2015 B2
9158953 Gillet et al. Oct 2015 B2
9159059 Daddabbo et al. Oct 2015 B2
9165174 Huck Oct 2015 B2
9171543 Emerick et al. Oct 2015 B2
9183425 Wang Nov 2015 B2
9189669 Zhu et al. Nov 2015 B2
9195844 Todeschini et al. Nov 2015 B2
9195907 Longacre, Jr. Nov 2015 B1
9202458 Braho et al. Dec 2015 B2
9208366 Liu Dec 2015 B2
9208367 Smith Dec 2015 B2
9219836 Bouverie et al. Dec 2015 B2
9224022 Ackley et al. Dec 2015 B2
9224024 Bremer et al. Dec 2015 B2
9224027 Van et al. Dec 2015 B2
D747321 London et al. Jan 2016 S
9230140 Ackley Jan 2016 B1
9235553 Fitch et al. Jan 2016 B2
9239950 Fletcher Jan 2016 B2
9245492 Ackley et al. Jan 2016 B2
9248640 Heng Feb 2016 B2
9250652 London et al. Feb 2016 B2
9250712 Todeschini Feb 2016 B1
9251411 Todeschini Feb 2016 B2
9258033 Showering Feb 2016 B2
9262633 Todeschini et al. Feb 2016 B1
9262660 Lu et al. Feb 2016 B2
9262662 Chen et al. Feb 2016 B2
9269036 Bremer Feb 2016 B2
9270782 Hala et al. Feb 2016 B2
9274812 Doren et al. Mar 2016 B2
9275388 Havens et al. Mar 2016 B2
9277668 Feng et al. Mar 2016 B2
9280693 Feng et al. Mar 2016 B2
9286496 Smith Mar 2016 B2
9297900 Jiang Mar 2016 B2
9298964 Li et al. Mar 2016 B2
9301427 Feng et al. Mar 2016 B2
D754205 Nguyen et al. Apr 2016 S
D754206 Nguyen et al. Apr 2016 S
9304376 Anderson Apr 2016 B2
9310609 Rueblinger et al. Apr 2016 B2
9313377 Todeschini et al. Apr 2016 B2
9317037 Byford et al. Apr 2016 B2
9319548 Showering et al. Apr 2016 B2
D757009 Oberpriller et al. May 2016 S
9342723 Liu et al. May 2016 B2
9342724 McCloskey et al. May 2016 B2
9360304 Xue et al. Jun 2016 B2
9361882 Ressler et al. Jun 2016 B2
9365381 Colonel et al. Jun 2016 B2
9373018 Colavito et al. Jun 2016 B2
9375945 Bowles Jun 2016 B1
9378403 Wang et al. Jun 2016 B2
D760719 Zhou et al. Jul 2016 S
9383848 Daghigh Jul 2016 B2
9384374 Bianconi Jul 2016 B2
9390304 Chang et al. Jul 2016 B2
9390596 Todeschini Jul 2016 B1
D762604 Fitch et al. Aug 2016 S
9411386 Sauerwein, Jr. Aug 2016 B2
9412242 Van et al. Aug 2016 B2
9418269 Havens et al. Aug 2016 B2
9418270 Van et al. Aug 2016 B2
9423318 Liu et al. Aug 2016 B2
9424454 Tao et al. Aug 2016 B2
D766244 Zhou et al. Sep 2016 S
9436860 Smith et al. Sep 2016 B2
9443123 Hejl Sep 2016 B2
9443222 Singel et al. Sep 2016 B2
9454689 McCloskey et al. Sep 2016 B2
9464885 Lloyd et al. Oct 2016 B2
9465967 Xian et al. Oct 2016 B2
9478113 Xie et al. Oct 2016 B2
9478983 Kather et al. Oct 2016 B2
D771631 Fitch et al. Nov 2016 S
9481186 Bouverie et al. Nov 2016 B2
9487113 Schukalski Nov 2016 B2
9488986 Solanki Nov 2016 B1
9489782 Payne et al. Nov 2016 B2
9490540 Davies et al. Nov 2016 B1
9491729 Rautiola et al. Nov 2016 B2
9497092 Gomez et al. Nov 2016 B2
9507974 Todeschini Nov 2016 B1
9519814 Cudzilo Dec 2016 B2
9521331 Bessettes et al. Dec 2016 B2
9530038 Xian et al. Dec 2016 B2
D777166 Bidwell et al. Jan 2017 S
9558386 Yeakley Jan 2017 B2
9572901 Todeschini Feb 2017 B2
9606581 Howe et al. Mar 2017 B1
D783601 Schulte et al. Apr 2017 S
D785617 Bidwell et al. May 2017 S
D785636 Oberpriller et al. May 2017 S
9646189 Lu et al. May 2017 B2
9646191 Unemyr et al. May 2017 B2
9652648 Ackley et al. May 2017 B2
9652653 Todeschini et al. May 2017 B2
9656487 Ho et al. May 2017 B2
9659198 Giordano et al. May 2017 B2
D790505 Vargo et al. Jun 2017 S
D790546 Zhou et al. Jun 2017 S
9680282 Hanenburg Jun 2017 B2
9697401 Feng et al. Jul 2017 B2
9701140 Alaganchetty et al. Jul 2017 B1
10679101 Ackley Jun 2020 B2
20020113125 Schuessler Aug 2002 A1
20040047508 Anisimovich Mar 2004 A1
20040074967 Takakura Apr 2004 A1
20070051814 Ehrhart Mar 2007 A1
20070063048 Havens et al. Mar 2007 A1
20080137955 Tsai Jun 2008 A1
20080185432 Caballero et al. Aug 2008 A1
20080310765 Reichenbach Dec 2008 A1
20090134221 Zhu et al. May 2009 A1
20090161930 Zahniser Jun 2009 A1
20100177076 Essinger et al. Jul 2010 A1
20100177080 Essinger et al. Jul 2010 A1
20100177707 Essinger et al. Jul 2010 A1
20100177749 Essinger et al. Jul 2010 A1
20100265880 Rautiola et al. Oct 2010 A1
20100310172 Natarajan Dec 2010 A1
20110022940 King Jan 2011 A1
20110134458 Kojima Jun 2011 A1
20110169999 Grunow et al. Jul 2011 A1
20110202554 Powilleit et al. Aug 2011 A1
20110294543 Lapstun Dec 2011 A1
20120087551 Bhagwan Apr 2012 A1
20120111946 Golant May 2012 A1
20120128251 Petrou May 2012 A1
20120168511 Kotlarsky et al. Jul 2012 A1
20120168512 Kotlarsky et al. Jul 2012 A1
20120193423 Samek Aug 2012 A1
20120194692 Mers et al. Aug 2012 A1
20120203647 Smith Aug 2012 A1
20120223141 Good et al. Sep 2012 A1
20120228382 Havens et al. Sep 2012 A1
20120248188 Kearney Oct 2012 A1
20130043312 Van Horn Feb 2013 A1
20130075168 Amundsen et al. Mar 2013 A1
20130077856 Ferro Mar 2013 A1
20130082104 Kearney et al. Apr 2013 A1
20130175341 Kearney et al. Jul 2013 A1
20130175343 Good Jul 2013 A1
20130257744 Daghigh et al. Oct 2013 A1
20130257759 Daghigh Oct 2013 A1
20130270346 Xian et al. Oct 2013 A1
20130292475 Kotlarsky et al. Nov 2013 A1
20130292477 Hennick et al. Nov 2013 A1
20130293539 Hunt et al. Nov 2013 A1
20130293540 Laffargue et al. Nov 2013 A1
20130306728 Thuries et al. Nov 2013 A1
20130306731 Pedrao Nov 2013 A1
20130307964 Bremer et al. Nov 2013 A1
20130308625 Park et al. Nov 2013 A1
20130313324 Koziol et al. Nov 2013 A1
20130332524 Fiala et al. Dec 2013 A1
20130332996 Fiala et al. Dec 2013 A1
20140001267 Giordano et al. Jan 2014 A1
20140002828 Laffargue et al. Jan 2014 A1
20140025584 Liu et al. Jan 2014 A1
20140034734 Sauerwein, Jr. Feb 2014 A1
20140036848 Pease et al. Feb 2014 A1
20140037181 Koo Feb 2014 A1
20140039693 Havens et al. Feb 2014 A1
20140049120 Kohtz et al. Feb 2014 A1
20140049635 Laffargue et al. Feb 2014 A1
20140061306 Wu et al. Mar 2014 A1
20140063289 Hussey et al. Mar 2014 A1
20140064618 Janssen, Jr. Mar 2014 A1
20140066136 Sauerwein et al. Mar 2014 A1
20140067692 Ye et al. Mar 2014 A1
20140070005 Nahill et al. Mar 2014 A1
20140071840 Venancio Mar 2014 A1
20140074746 Wang Mar 2014 A1
20140076974 Havens et al. Mar 2014 A1
20140078342 Li et al. Mar 2014 A1
20140097249 Gomez et al. Apr 2014 A1
20140098792 Wang et al. Apr 2014 A1
20140100774 Showering Apr 2014 A1
20140100813 Showering Apr 2014 A1
20140103115 Meier et al. Apr 2014 A1
20140104413 McCloskey et al. Apr 2014 A1
20140104414 McCloskey et al. Apr 2014 A1
20140104416 Giordano et al. Apr 2014 A1
20140106725 Sauerwein, Jr. Apr 2014 A1
20140108010 Maltseff et al. Apr 2014 A1
20140108402 Gomez et al. Apr 2014 A1
20140108682 Caballero Apr 2014 A1
20140110485 Toa et al. Apr 2014 A1
20140114530 Fitch et al. Apr 2014 A1
20140125853 Wang May 2014 A1
20140125999 Longacre et al. May 2014 A1
20140129378 Richardson May 2014 A1
20140131443 Smith May 2014 A1
20140131444 Wang May 2014 A1
20140133379 Wang et al. May 2014 A1
20140136208 Maltseff et al. May 2014 A1
20140140585 Wang May 2014 A1
20140152882 Samek et al. Jun 2014 A1
20140158770 Sevier et al. Jun 2014 A1
20140159869 Zumsteg et al. Jun 2014 A1
20140166755 Liu et al. Jun 2014 A1
20140166757 Smith Jun 2014 A1
20140166759 Liu et al. Jun 2014 A1
20140168787 Wang et al. Jun 2014 A1
20140175165 Havens et al. Jun 2014 A1
20140191684 Valois Jul 2014 A1
20140191913 Ge et al. Jul 2014 A1
20140197239 Havens et al. Jul 2014 A1
20140197304 Feng et al. Jul 2014 A1
20140204268 Grunow et al. Jul 2014 A1
20140214631 Hansen Jul 2014 A1
20140217166 Berthiaume et al. Aug 2014 A1
20140217180 Liu Aug 2014 A1
20140231500 Ehrhart et al. Aug 2014 A1
20140247315 Marty et al. Sep 2014 A1
20140263493 Amurgis et al. Sep 2014 A1
20140263645 Smith et al. Sep 2014 A1
20140270196 Braho et al. Sep 2014 A1
20140270229 Braho Sep 2014 A1
20140278387 Digregorio Sep 2014 A1
20140282210 Bianconi Sep 2014 A1
20140288933 Braho et al. Sep 2014 A1
20140297058 Barker et al. Oct 2014 A1
20140299665 Barber et al. Oct 2014 A1
20140332590 Wang et al. Nov 2014 A1
20140351317 Smith et al. Nov 2014 A1
20140362184 Jovanovski et al. Dec 2014 A1
20140363015 Braho Dec 2014 A1
20140369511 Sheerin et al. Dec 2014 A1
20140374483 Lu Dec 2014 A1
20140374485 Xian et al. Dec 2014 A1
20150001301 Ouyang Jan 2015 A1
20150001304 Todeschini Jan 2015 A1
20150009338 Laffargue et al. Jan 2015 A1
20150014416 Kotlarsky et al. Jan 2015 A1
20150021397 Rueblinger et al. Jan 2015 A1
20150023599 Geva Jan 2015 A1
20150028104 Ma et al. Jan 2015 A1
20150029002 Yeakley et al. Jan 2015 A1
20150032709 Maloy et al. Jan 2015 A1
20150039309 Braho et al. Feb 2015 A1
20150039878 Barten Feb 2015 A1
20150040378 Saber et al. Feb 2015 A1
20150049347 Laffargue et al. Feb 2015 A1
20150051992 Smith Feb 2015 A1
20150053769 Thuries et al. Feb 2015 A1
20150062366 Liu et al. Mar 2015 A1
20150063215 Wang Mar 2015 A1
20150088522 Hendrickson et al. Mar 2015 A1
20150096872 Woodburn Apr 2015 A1
20150100196 Hollifield Apr 2015 A1
20150102109 Huck Apr 2015 A1
20150115035 Meier et al. Apr 2015 A1
20150127791 Kosecki et al. May 2015 A1
20150128116 Chen et al. May 2015 A1
20150133047 Smith et al. May 2015 A1
20150134470 Hejl et al. May 2015 A1
20150136851 Harding et al. May 2015 A1
20150142492 Kumar May 2015 A1
20150144692 Hejl May 2015 A1
20150144698 Teng et al. May 2015 A1
20150149946 Benos et al. May 2015 A1
20150161429 Xian Jun 2015 A1
20150178523 Gelay et al. Jun 2015 A1
20150178537 El et al. Jun 2015 A1
20150178685 Krumel et al. Jun 2015 A1
20150181109 Gillet et al. Jun 2015 A1
20150186703 Chen et al. Jul 2015 A1
20150199957 Funyak et al. Jul 2015 A1
20150210199 Payne Jul 2015 A1
20150212565 Murawski et al. Jul 2015 A1
20150213647 Laffargue et al. Jul 2015 A1
20150220753 Zhu et al. Aug 2015 A1
20150220901 Gomez et al. Aug 2015 A1
20150227189 Davis et al. Aug 2015 A1
20150236984 Sevier Aug 2015 A1
20150239348 Chamberlin Aug 2015 A1
20150242658 Nahill et al. Aug 2015 A1
20150248572 Soule et al. Sep 2015 A1
20150254485 Feng et al. Sep 2015 A1
20150261643 Caballero et al. Sep 2015 A1
20150264624 Wang et al. Sep 2015 A1
20150268971 Barten Sep 2015 A1
20150269402 Barber et al. Sep 2015 A1
20150288689 Todeschini et al. Oct 2015 A1
20150288896 Wang Oct 2015 A1
20150310243 Ackley et al. Oct 2015 A1
20150310244 Xian et al. Oct 2015 A1
20150310389 Crimm et al. Oct 2015 A1
20150312780 Wang et al. Oct 2015 A1
20150327012 Bian et al. Nov 2015 A1
20150339526 Macciola Nov 2015 A1
20160014251 Hejl Jan 2016 A1
20160025697 Alt et al. Jan 2016 A1
20160026838 Gillet et al. Jan 2016 A1
20160026839 Qu et al. Jan 2016 A1
20160040982 Li et al. Feb 2016 A1
20160042241 Todeschini Feb 2016 A1
20160057230 Todeschini et al. Feb 2016 A1
20160062473 Bouchat et al. Mar 2016 A1
20160070944 McCloskey et al. Mar 2016 A1
20160092805 Geisler et al. Mar 2016 A1
20160101936 Chamberlin Apr 2016 A1
20160102975 McCloskey et al. Apr 2016 A1
20160104019 Todeschini et al. Apr 2016 A1
20160104274 Jovanovski et al. Apr 2016 A1
20160109219 Ackley et al. Apr 2016 A1
20160109220 Laffargue et al. Apr 2016 A1
20160109224 Thuries et al. Apr 2016 A1
20160112631 Ackley et al. Apr 2016 A1
20160112643 Laffargue et al. Apr 2016 A1
20160117627 Raj et al. Apr 2016 A1
20160124516 Schoon et al. May 2016 A1
20160125217 Todeschini May 2016 A1
20160125342 Miller et al. May 2016 A1
20160125873 Braho et al. May 2016 A1
20160133253 Braho et al. May 2016 A1
20160171597 Todeschini Jun 2016 A1
20160171666 McCloskey Jun 2016 A1
20160171720 Todeschini Jun 2016 A1
20160171775 Todeschini et al. Jun 2016 A1
20160171777 Todeschini et al. Jun 2016 A1
20160174674 Oberpriller et al. Jun 2016 A1
20160178479 Goldsmith Jun 2016 A1
20160178685 Young et al. Jun 2016 A1
20160178707 Young et al. Jun 2016 A1
20160179132 Harr Jun 2016 A1
20160179143 Bidwell et al. Jun 2016 A1
20160179368 Roeder Jun 2016 A1
20160179378 Kent et al. Jun 2016 A1
20160180130 Bremer Jun 2016 A1
20160180133 Oberpriller et al. Jun 2016 A1
20160180136 Meier et al. Jun 2016 A1
20160180594 Todeschini Jun 2016 A1
20160180663 McMahan et al. Jun 2016 A1
20160180678 Ackley et al. Jun 2016 A1
20160180713 Bernhardt et al. Jun 2016 A1
20160185136 Ng et al. Jun 2016 A1
20160185291 Chamberlin Jun 2016 A1
20160186926 Oberpriller et al. Jun 2016 A1
20160188861 Todeschini Jun 2016 A1
20160188939 Sailors et al. Jun 2016 A1
20160188940 Lu et al. Jun 2016 A1
20160188941 Todeschini et al. Jun 2016 A1
20160188942 Good et al. Jun 2016 A1
20160188943 Franz Jun 2016 A1
20160188944 Wilz et al. Jun 2016 A1
20160189076 Mellott et al. Jun 2016 A1
20160189087 Morton et al. Jun 2016 A1
20160189088 Pecorari et al. Jun 2016 A1
20160189092 George et al. Jun 2016 A1
20160189284 Mellott et al. Jun 2016 A1
20160189288 Todeschini et al. Jun 2016 A1
20160189366 Chamberlin et al. Jun 2016 A1
20160189443 Smith Jun 2016 A1
20160189447 Valenzuela Jun 2016 A1
20160189489 Au et al. Jun 2016 A1
20160191684 Dipiazza et al. Jun 2016 A1
20160192051 Dipiazza et al. Jun 2016 A1
20160202951 Pike et al. Jul 2016 A1
20160202958 Zabel et al. Jul 2016 A1
20160202959 Doubleday et al. Jul 2016 A1
20160203021 Pike et al. Jul 2016 A1
20160203429 Mellott et al. Jul 2016 A1
20160203797 Pike et al. Jul 2016 A1
20160203820 Zabel et al. Jul 2016 A1
20160204623 Haggerty et al. Jul 2016 A1
20160204636 Allen et al. Jul 2016 A1
20160204638 Miraglia et al. Jul 2016 A1
20160227912 Oberpriller et al. Aug 2016 A1
20160232891 Pecorari Aug 2016 A1
20160292477 Bidwell Oct 2016 A1
20160294779 Yeakley et al. Oct 2016 A1
20160306769 Kohtz et al. Oct 2016 A1
20160314276 Wilz et al. Oct 2016 A1
20160314294 Kubler et al. Oct 2016 A1
20160316190 McCloskey et al. Oct 2016 A1
20160323310 Todeschini et al. Nov 2016 A1
20160325677 Fitch et al. Nov 2016 A1
20160327614 Young et al. Nov 2016 A1
20160327930 Charpentier et al. Nov 2016 A1
20160328762 Pape Nov 2016 A1
20160330218 Hussey et al. Nov 2016 A1
20160343163 Venkatesha et al. Nov 2016 A1
20160343176 Ackley Nov 2016 A1
20160364914 Todeschini Dec 2016 A1
20160370220 Ackley et al. Dec 2016 A1
20160372282 Bandringa Dec 2016 A1
20160373847 Vargo et al. Dec 2016 A1
20160377414 Thuries et al. Dec 2016 A1
20160377417 Jovanovski et al. Dec 2016 A1
20170010141 Ackley Jan 2017 A1
20170010328 Mullen et al. Jan 2017 A1
20170010780 Waldron et al. Jan 2017 A1
20170016714 Laffargue et al. Jan 2017 A1
20170018094 Todeschini Jan 2017 A1
20170046603 Lee et al. Feb 2017 A1
20170047864 Stang et al. Feb 2017 A1
20170053146 Liu et al. Feb 2017 A1
20170053147 Germaine et al. Feb 2017 A1
20170053647 Nichols et al. Feb 2017 A1
20170055606 Xu et al. Mar 2017 A1
20170060316 Larson Mar 2017 A1
20170061961 Nichols et al. Mar 2017 A1
20170064634 Van et al. Mar 2017 A1
20170083730 Feng et al. Mar 2017 A1
20170091502 Furlong et al. Mar 2017 A1
20170091706 Lloyd et al. Mar 2017 A1
20170091741 Todeschini Mar 2017 A1
20170091904 Ventress, Jr. Mar 2017 A1
20170092908 Chaney Mar 2017 A1
20170094238 Germaine et al. Mar 2017 A1
20170098947 Wolski Apr 2017 A1
20170100949 Celinder et al. Apr 2017 A1
20170108838 Todeschini et al. Apr 2017 A1
20170108895 Chamberlin et al. Apr 2017 A1
20170118355 Wong et al. Apr 2017 A1
20170123598 Phan et al. May 2017 A1
20170124369 Rueblinger et al. May 2017 A1
20170124396 Todeschini et al. May 2017 A1
20170124687 McCloskey et al. May 2017 A1
20170126873 McGary et al. May 2017 A1
20170126904 D'Armancourt et al. May 2017 A1
20170139012 Smith May 2017 A1
20170140329 Bernhardt et al. May 2017 A1
20170140731 Smith May 2017 A1
20170147847 Berggren et al. May 2017 A1
20170150124 Thuries May 2017 A1
20170169198 Nichols Jun 2017 A1
20170171035 Lu et al. Jun 2017 A1
20170171703 Maheswaranathan Jun 2017 A1
20170171803 Maheswaranathan Jun 2017 A1
20170180359 Wolski et al. Jun 2017 A1
20170180577 Nguon et al. Jun 2017 A1
20170181299 Shi et al. Jun 2017 A1
20170190192 Delario et al. Jul 2017 A1
20170193432 Bernhardt Jul 2017 A1
20170193461 Celinder et al. Jul 2017 A1
20170193727 Van et al. Jul 2017 A1
20170199266 Rice et al. Jul 2017 A1
20170200108 Au et al. Jul 2017 A1
20170200275 McCloskey et al. Jul 2017 A1
20170220886 Canero Morales Aug 2017 A1
20170235991 Unemyr et al. Aug 2017 A1
20170244850 Ishitori Aug 2017 A1
20170270508 Roach Sep 2017 A1
20170293788 Taira Oct 2017 A1
20170293817 Bonch-Osmolovsky Oct 2017 A1
20180336226 Anorga Nov 2018 A1
Foreign Referenced Citations (1)
Number Date Country
2013163789 Nov 2013 WO
Non-Patent Literature Citations (10)
Entry
Communication Pursuant to Article 94(3) issued in European Application No. 18202437.2, dated Jun. 22, 2021, 4 pages.
Bigelow, Oh, oh, zero!, TUGboat, vol. 34, No. 2, Aug. 1, 2013, pp. 168-181 [Cited in EP Search Report].
European Search Report and Search Opinion Received for EP Application No. 18202437.2, dated Mar. 13, 2019, 8 pages.
Examiner initiated interview summary (PTOL-413B) dated Feb. 5, 2020 for U.S. Appl. No. 15/793,407.
Final Rejection dated Nov. 15, 2019 for U.S. Appl. No. 15/793,407.
Frutiger, “OCR-B: A Standardized Character for Optical Recognition”, The Journal of Typographic Research, vol. 1, No. 2, Apr. 1, 1967, pp. 137-146 [Cited in EP Search Report].
Non-Final Rejection dated May 3, 2019 for U.S. Appl. No. 15/793,407.
Notice of Allowance and Fees Due (PTOL-85) dated Feb. 5, 2020 for U.S. Appl. No. 15/793,407.
Rule 70 (2) Communication for European Application No. 18202437.2, dated May 5, 2019, 2 pages.
Search Report in related European Application No. 18202437.2 dated Mar. 19, 2019, pp. 1-9.
Related Publications (1)
Number Date Country
20200311478 A1 Oct 2020 US
Continuations (1)
Number Date Country
Parent 15793407 Oct 2017 US
Child 16867318 US