OPTICAL CHARACTER RECOGNITION SYSTEMS AND METHODS

Information

  • Patent Application
  • Publication Number
    20190122079
  • Date Filed
    October 25, 2017
  • Date Published
    April 25, 2019
Abstract
The present disclosure is generally directed to systems and methods for executing optical character recognition faster than at least some traditional OCR systems, without sacrificing recognition accuracy. Towards this end, various exemplary embodiments involve the use of a bounding box and a grid-based template to identify certain unique aspects of each of various characters and/or numerals. For example, in one embodiment, the grid-based template can be used to recognize a numeral and/or a character based on a difference in centerline height between the numeral and the character when a monospaced font is used. In another exemplary embodiment, the grid-based template can be used to recognize an individual digit among a plurality of digits based on certain parts of the individual digit being uniquely located in specific portions of the grid-based template.
Description
FIELD OF THE INVENTION

The present invention generally relates to optical character recognition systems and more particularly relates to systems and methods for improving recognition speed.


BACKGROUND

Traditional optical character recognition (OCR) systems often sacrifice speed in the interest of ensuring accuracy in character recognition. The traditional character recognition process typically incorporates a template to execute a character-by-character recognition of various characters. The template, which can be one of a number of different types of templates, is associated with a pattern-matching algorithm that identifies a specific numeral or a letter of an alphabet. Certain characters, such as the numeral zero and the letter “O,” are relatively similar to each other. Consequently, the process of using the pattern-matching algorithm tends to be slow in order to ensure that such characters are not misinterpreted. However, it is desirable to provide systems and methods that provide for faster optical character recognition without sacrificing accuracy.


SUMMARY

In an exemplary embodiment in accordance with the disclosure, a method includes using an optical character recognition system to execute an optical character recognition procedure. The optical character recognition procedure includes applying a grid-based template to a character having a monospaced font; defining in the grid-based template, a first grid section that includes a first portion of the character when the character has a first size and excludes the first portion of the character when the character has a second size that is smaller than the first size; and recognizing the character as a numeral when the first grid section includes the first portion of the character.


In another exemplary embodiment in accordance with the disclosure, a method includes providing to an optical character recognition system, a barcode label containing a plurality of digits, and using the optical character recognition system to execute an optical character recognition procedure. The optical character recognition procedure includes applying a bounding box to an individual digit among the plurality of digits contained in the barcode label; applying a grid-based template to the bounding box, the grid-based template comprising a plurality of grid sections; and using the plurality of grid sections to identify the individual digit contained in the bounding box.


In yet another exemplary embodiment in accordance with the disclosure, a method includes using an optical character recognition system to execute an optical character recognition procedure. The optical character recognition procedure includes applying a bounding box to a character; applying a grid-based template to the bounding box; defining a portion of the grid-based template as a primary search area; and using at least the primary search area to identify the character contained in the bounding box.


The foregoing illustrative summary, as well as other exemplary objectives and/or advantages described in this disclosure, and the manner in which the same are accomplished, are further explained within the following detailed description and its accompanying drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 schematically depicts an exemplary embodiment of an OCR system in accordance with the disclosure.



FIG. 2 schematically depicts another exemplary embodiment of an OCR system in accordance with the disclosure.



FIG. 3 shows an exemplary grid-based template that can be used to execute optical character recognition in accordance with the disclosure.



FIG. 4 shows an exemplary grid-based template that can be used to execute optical character recognition for identifying a specific letter of an alphabet in accordance with the disclosure.



FIG. 5 shows an exemplary list of coordinate locations in a grid-based template that can be used to uniquely identify letters of the English alphabet in accordance with the disclosure.



FIG. 6 shows an exemplary grid-based template when used to execute optical character recognition for identifying a specific numeral in accordance with the disclosure.



FIG. 7 shows a flowchart of a method to execute optical character recognition in accordance with the disclosure.



FIG. 8 shows an exemplary set of coordinate locations and a look-up table that can be used in conjunction with the set of coordinate locations to uniquely identify any character using a single-step recognition procedure in accordance with the disclosure.





DETAILED DESCRIPTION

Throughout this description, embodiments and variations are described for the purpose of illustrating uses and implementations of inventive concepts. The illustrative description should be understood as presenting examples of inventive concepts, rather than as limiting the scope of the concepts as disclosed herein. Towards this end, certain words and terms are used herein solely for convenience, and such words and terms should be broadly understood as encompassing various objects and actions that are generally understood in various forms and equivalencies by persons of ordinary skill in the art. For example, the word “numeral” as used herein is equally applicable to other words such as “digit” and “number.” The word “character” as used herein pertains to any printed or written material that is recognizable using optical character recognition techniques. It should also be understood that the word “example” as used herein is intended to be non-exclusionary and non-limiting in nature. More particularly, the word “exemplary” as used herein indicates one among several examples, and it should be understood that no special emphasis, exclusivity, or preference is associated with or implied by the use of this word.


The present disclosure is generally directed to systems and methods for executing optical character recognition faster than at least some traditional OCR systems, without sacrificing recognition accuracy. Towards this end, various exemplary embodiments involve the use of a bounding box and a grid-based template to identify certain unique aspects of each of various characters and/or numerals. For example, in one embodiment, the grid-based template can be used to recognize a numeral and/or a character based on a difference in centerline height between the numeral and the character when a monospaced font is used. In another exemplary embodiment, the grid-based template can be used to recognize an individual digit among a plurality of digits based on certain parts of the individual digit being uniquely located in specific portions of the grid-based template.



FIG. 1 schematically depicts an exemplary OCR system 100 in accordance with the disclosure. The OCR system 100 depicted in this exemplary embodiment includes an image scanning system 110 (a flat-bed scanner in this example) communicatively coupled to a computer 105. In other embodiments, various other hardware elements such as a handheld scanner or an overhead scanner can constitute the image scanning system 110.


When in operation, the image scanning system 110 captures an image of characters and text located on an object 107, such as a printed sheet of paper or a machine-readable zone (MRZ) on a passport. The captured image is provided to the computer 105 via a communication link 106 (a wire, a communications cable, a wireless link, etc.). The computer 105 includes OCR software that is used to carry out OCR operations upon the captured image in accordance with the disclosure.


In an alternative embodiment, the computer 105 can be omitted and the OCR software incorporated into the image scanning system 110, which operates as a multifunction unit to execute various operations such as scanning, printing, faxing, and OCR.


In yet another alternative embodiment, the image scanning system 110 can be omitted and the computer 105 configured to generate a document and/or receive a document via a communication network such as the Internet. OCR software contained in the computer 105 in accordance with the disclosure can then be used to carry out OCR operations upon the received/generated document.



FIG. 2 schematically depicts an exemplary OCR system 200 in accordance with the disclosure. OCR system 200 includes an image scanning system 210 communicatively coupled to a processing system 205 via a communications link 207 (such as a wire, a communications cable, a wireless link, or a metal track on a printed circuit board). The image scanning system 210 includes a light source 211 that projects light through a transparent window 213 upon an object 214. The object 214, which can be a sheet of paper containing text and/or images, reflects the light towards an image sensor 212. The image sensor 212, which contains light sensing elements such as photodiodes and/or photocells, converts the received light into electrical signals (digital bits for example) that are transmitted to the OCR software 206 contained in the processing system 205. In one example embodiment, the OCR system 200 is a slot scanner incorporating a linear array of photocells. The OCR software 206 that is a part of the processing system 205 can be used in accordance with the disclosure to operate upon the electrical signals for performing optical character recognition of the material printed upon the object 214.



FIG. 3 shows a first exemplary grid-based template 300 that can be used to execute optical character recognition in accordance with the disclosure. The exemplary grid-based template 300, which is executed in the form of a software algorithm, has a rectangular shape with an x-axis having a first set of numerical coordinates (ranging from −40 to +40 in this example) and a y-axis having a second set of numerical coordinates (ranging from 0 to 140 in this example). The numerical coordinates of this x-y mapping system can be used to define various grid sections in the grid-based template 300 (grid section 305, grid section 310, grid section 315, etc.).
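
For illustration, the following Python sketch shows one possible way to map the “on” pixels of a binarized, bounding-boxed character image onto the coordinate system of a grid-based template such as the grid-based template 300. The function name, the use of NumPy, and the convention of scaling by the known monospaced cell width (so that taller characters reach higher y values instead of being normalized away) are assumptions made for this sketch rather than details taken from the disclosure.

```python
import numpy as np

# Template extents taken from the example in FIG. 3: x spans -40..+40, y spans 0..140.
X_MIN, X_MAX = -40, 40

def pixels_to_template(glyph, cell_width_px):
    """Map the 'on' pixels of a binarized glyph onto grid-template coordinates.

    glyph         -- 2D NumPy array of 0/1 values cropped to the character's
                     bounding box (row 0 is the top of the glyph).
    cell_width_px -- pixel width of one monospaced character cell; using the
                     cell width (rather than the glyph height) as the scale
                     preserves the height differences between characters.
    Returns a set of integer (x, y) template coordinates.
    """
    units_per_px = (X_MAX - X_MIN) / float(cell_width_px)
    rows, cols = glyph.shape
    coords = set()
    for r in range(rows):
        for c in range(cols):
            if glyph[r, c]:
                x = (c - cols / 2.0) * units_per_px   # character centered on x = 0
                y = (rows - 1 - r) * units_per_px     # character baseline at y = 0
                coords.add((round(x), round(y)))
    return coords
```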


More particularly, in accordance with the disclosure, the grid-based template 300 can be used to perform character recognition upon various text characters, such as the numeral “9” and the letter “P” of an alphabet that are shown as examples. In one or more exemplary implementations, the characters can conform to what is known in the industry as an OCR font. One specific OCR font that is popular in the industry is an OCR-B font. The OCR-B font, which resembles an Arial font to some extent, is used on various items such as passports and car license plates, and as human-readable characters in barcode symbols (European Article Number (EAN) barcodes and Universal Product Code (U.P.C.) barcodes, for example).


The OCR-B font has been specifically tailored to minimize machine reading errors. However, even with such tailoring, it is often quite time-consuming for traditional OCR systems to execute character recognition, because each character has to be recognized using a multi-step template-based matching procedure involving the use of a number of templates. For example, the International Civil Aviation Organization (ICAO) uses thirty-seven templates for identifying various characters in a passport. Thus, for a symbol containing 20 characters, a traditional multi-step template-based matching procedure can involve executing 740 match attempts using the thirty-seven templates (20×37=740).


The use of the grid-based template 300 to perform character recognition in accordance with the disclosure will now be described. The similarity in shapes between these two exemplary characters (“9” and “P”) poses a challenge to any OCR system, particularly to a traditional system that uses a multi-step template-based matching procedure involving a relatively large number of templates. Thus, in one exemplary method in accordance with the disclosure, the first step involves the use of the grid-based template 300 to identify differences between the two characters on the basis of size. Such an approach takes advantage of the fact that in various fonts, and particularly in the OCR-B font, numerals have a different size in comparison to the letters of an alphabet.


More particularly, the OCR-B font is a monospaced font having a fixed spacing between adjacent characters and variable heights for the various characters. The rectangular shape of the grid-based template 300 is configured to accommodate numerals and letters that are aligned with respect to a common reference point (in this example, the common reference point corresponds to the coordinates 0,0 of the grid-based template 300), thereby allowing a comparison of centerline heights between two or more characters. The numerals can be quickly detected based on the fact that the centerline height of all numerals is greater than the centerline height of all letters of an alphabet when the OCR-B font is used. In some implementations in accordance with the disclosure, a primary search area can be defined as a circle having a diameter that is substantially equal to a width of the monospaced font, or as an oval having a minor axis that is substantially equal to a width of the monospaced font and a major axis that is substantially equal to a centerline height of a numeral in the monospaced font.
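
As one possible reading of the primary search area described above, the following sketch tests whether a template coordinate falls inside a circle whose diameter equals the monospaced font width, or inside an oval whose minor axis equals the font width and whose major axis equals the centerline height of a numeral. The placement of the area (centered horizontally on x = 0 and resting on the baseline) is an assumption for illustration; the disclosure does not specify where the area is anchored.

```python
def in_primary_search_area(x, y, font_width, numeral_height=None):
    """Return True if template coordinate (x, y) lies inside the primary search area.

    With numeral_height=None the area is a circle of diameter font_width;
    otherwise it is an oval with minor axis font_width and major axis
    numeral_height. Both are assumed (for this sketch) to be centered on
    x = 0 and to rest on the baseline at y = 0.
    """
    a = font_width / 2.0                                    # horizontal semi-axis
    b = a if numeral_height is None else numeral_height / 2.0
    cy = b                                                  # vertical center of the area
    return (x / a) ** 2 + ((y - cy) / b) ** 2 <= 1.0
```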


With respect to the two exemplary characters shown in FIG. 3, the numeral “9” has a centerline height that approximately corresponds to a numerical coordinate 128 on the y-axis of the grid-based template 300. In contrast, the letter “P” is shorter than the numeral “9” and has a centerline height that approximately corresponds to a numerical coordinate 118 on the y-axis of the grid-based template 300.


The software algorithm used for executing the grid-based template 300 can recognize that a portion of the numeral “9” is present in certain grid sections such as in grid section 305 and grid section 310, whereas no portion of the letter “P” (which is shorter than the numeral “9”) is present in either grid section 305 or grid section 310. More particularly, when using the exemplary grid-based template 300, none of the grid sections above the numerical coordinate 120 will contain any portion of a letter.


Thus, in accordance with the disclosure, the software algorithm is used to apply the grid-based template 300 to an image and rapidly differentiate between a numeral and a letter based on the centerline height difference between numerals and letters of a monospaced font. The differentiating procedure thus eliminates a large subset of characters from being considered as potential candidates for further processing.
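
A minimal sketch of this height-based differentiation is shown below, assuming coordinates produced by a mapping like the one sketched earlier and assuming, as in the example above, that no letter of the monospaced font extends above the y-coordinate 120.

```python
LETTER_HEIGHT_LIMIT = 120   # in the FIG. 3 example, no letter reaches above y = 120

def looks_like_numeral(template_coords):
    """Classify a character as a numeral if any of its 'on' coordinates lies
    above the letter height limit; numerals are taller than letters in OCR-B."""
    return any(y > LETTER_HEIGHT_LIMIT for _, y in template_coords)
```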



FIG. 4 shows the grid-based template 300 when used to execute optical character recognition for identifying a specific letter of an alphabet in accordance with the disclosure. As described above, no portion of the character will be present in the grid sections above the numerical coordinate 120 in this example. Consequently, the software algorithm limits the processing to grid sections below the numerical coordinate 120 and can further limit the processing to a minimal set of grid locations (three grid sections, for example) in the grid sections below the numerical coordinate 120. The minimal set of grid locations can also be defined at least in part by a circle having a diameter that is substantially equal to a width of the monospaced font or an oval having a minor axis that is substantially equal to a width of the monospaced font and a major axis that is substantially equal to a centerline height of a numeral in the monospaced font.


The processing involves identifying a letter by searching for a presence of a portion of a number of candidate letters (a through z, when the letter is a part of an English alphabet) in various grid sections of the grid-based template 300. An inefficient way to carry out the search would involve a time-consuming scan of each and every grid section of the grid-based template 300. On the other hand, in accordance with the disclosure, the search is carried out by first eliminating from the search grid sections that are known beforehand to be areas in which no portion of any letter will be present. The search is thus confined to areas where it is feasible that portions of any one of various letters may be present. The search is further narrowed to examine certain unique grid coordinate locations on the grid-based template 300 that would assist in quickly identifying a specific letter among all the potential candidates.


Accordingly, in one exemplary embodiment, the narrowed search procedure involves examining a group of three coordinate locations in a minimal group of grid sections along the y-axis of the grid-based template 300. The three coordinate locations (coordinate location 405, coordinate location 410, and coordinate location 415) provide information that assists in uniquely identifying a particular letter among a set of letters. The set of letters shown in exemplary FIG. 4 are A, J, M, and W. A portion of the letter A is present at the coordinate location 405 corresponding to (0, 118), which is unique to the letter A. No portion of J, M, or W is present at the first coordinate location 405.


A portion of the letter W is present at a coordinate location 410 corresponding to (0, 68), which is unique to the letter W. No portion of A, J or M is present at the second coordinate location 410.


A portion of the letter J is present at the coordinate location 415 corresponding to (0, 8), which is unique to the letter J. No portion of A, M or W is present at the third coordinate location 415.


The letter M has no portion present at any of the coordinate location 405, the coordinate location 410, or the coordinate location 415.


In another exemplary embodiment, the narrowed search procedure involves examining a set of four grid sections located at four corners of the grid-based template 300. This set of four grid sections can provide additional information such as the presence of portions of each of multiple letters and/or an absence of one or more portions of one or more letters.


Upon completing the search procedure at the group of three exemplary coordinate locations, the software algorithm uses a lookup table to at least make a preliminary determination of the identity of the letter. The lookup table includes information indicating that the letter A is uniquely identifiable via the first coordinate location 405, the letter W is uniquely identifiable via the second coordinate location 410, the letter J is uniquely identifiable via the third coordinate location 415, and so on. Using a compact search procedure based on three unique coordinate locations (in this example), coupled with the use of a lookup table, allows for fast recognition of various letters in accordance with the disclosure. In other exemplary search procedures, fewer or more than three coordinate locations can be used. Furthermore, in some embodiments, the use of unique coordinate locations as described above allows for execution of a search procedure for identifying a letter without necessarily first making a determination whether the character is a numeral or a letter.
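
The probe-and-lookup step can be sketched as follows. The three probe coordinates are those given for coordinate locations 405, 410, and 415; the tolerance used to decide whether a portion of the character is “present” at a location, and the tuple-keyed form of the lookup table, are illustrative assumptions rather than details taken from the disclosure.

```python
# Probe coordinates from FIG. 4 (coordinate locations 405, 410, and 415).
LETTER_PROBES = [(0, 118), (0, 68), (0, 8)]

# Lookup table for the A/J/M/W example: which probes are hit for each letter.
# M is the default result when no probe is hit.
LETTER_TABLE = {
    (True,  False, False): "A",
    (False, True,  False): "W",
    (False, False, True):  "J",
    (False, False, False): "M",
}

def probe(template_coords, location, tolerance=3):
    """Return True if any 'on' coordinate lies within `tolerance` grid units of
    the probe location (a simple stand-in for testing a small grid section)."""
    px, py = location
    return any(abs(x - px) <= tolerance and abs(y - py) <= tolerance
               for x, y in template_coords)

def recognize_letter(template_coords):
    """Probe the three locations and look the result up in the table."""
    key = tuple(probe(template_coords, loc) for loc in LETTER_PROBES)
    return LETTER_TABLE.get(key)   # None when the pattern matches no table entry
```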



FIG. 5 shows an exemplary list of coordinate locations in the grid-based template 300 that can be used to uniquely identify letters of the English alphabet in accordance with the disclosure. As can be understood from the list, some letters, such as F, P, R, B, and E, can be identified by using a combination of two or more unique coordinate locations because these letters cannot be uniquely identified by using a single coordinate location as is done for A, J, M, and W described above. Furthermore, certain letters such as K and M can be identified by using a default identification mode, as no portion of any of these letters is present at the three coordinate locations. The default identification mode can be applied after completion of the search for letters such as A, J, M, and W based on the three coordinate locations.



FIG. 6 shows the grid-based template 300 when used to execute optical character recognition for identifying a specific numeral in accordance with the disclosure. In a manner similar to that described above when using the grid-based template 300 to identify a specific letter, a search can be carried out in accordance with the disclosure to detect the presence of a numeral at certain unique coordinate locations and/or grid sections on the grid-based template 300.


Accordingly, in one exemplary embodiment, a search procedure is carried out by first eliminating from the search grid sections that are known beforehand to be areas in which no portion of any numeral will be present. The search is thus confined to areas where any one of various numerals can be present. However, the search is further narrowed to first examine certain unique grid sections and/or grid coordinate locations where portions of one or more specific numerals may be present.


Accordingly, in one exemplary embodiment, the narrowed search procedure involves detecting the numeral “1” (indicated in a dashed line format) by examining four coordinate locations located in four specific grid sections that constitute a minimal group of grid sections in this case. The presence of a portion of a numeral at the coordinate location 605 (first coordinate location) provides a strong indication that the numeral can be a “1”. The identity of the numeral can be confirmed by examining three additional coordinate locations, which in this case correspond to the coordinate location 610, the coordinate location 615, and the coordinate location 620. The presence of other portions of the numeral at each additional coordinate location provides a continuously increasing level of confidence that the numeral is indeed a “1”. Thus, testing at most four coordinate locations provides a strong indication that the recognized numeral is a “1” without having to search additional areas of the grid-based template 300. Other numerals can be similarly recognized using fewer or more coordinate locations.


Upon completing the search procedure for the numeral “1” at the exemplary coordinate locations, the software algorithm uses a lookup table to at least make a preliminary determination of the identity of the numeral. The lookup table includes information indicating that the numeral “1” is uniquely identifiable via the four coordinates described above.


As another example, a narrowed search procedure for detecting the numeral “7” (indicated in a solid line format) can be carried out by first examining coordinate location 625. The presence of a portion of a numeral at the coordinate location 625 provides a strong indication that the numeral can be a “7”. The identity of the numeral can be confirmed by examining additional coordinate locations such as coordinate location 630. The lookup table includes information indicating that the numeral “7” is uniquely identifiable via the coordinate location 625 and/or the coordinate location 630.
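
The same probing idea can be applied to numerals, with the number of probes examined traded off against confidence. In the sketch below the coordinate values stand in for coordinate locations 605-620 (for “1”) and 625-630 (for “7”); the numeric values are placeholders, since the figure coordinates are not reproduced in this text, and the `probe` helper from the previous sketch is reused.

```python
# Placeholder probe coordinates standing in for coordinate locations 605-620
# ("1") and 625-630 ("7"); the actual values from FIG. 6 are not given here.
NUMERAL_PROBES = {
    "1": [(0, 130), (0, 95), (0, 55), (0, 15)],
    "7": [(30, 125), (5, 40)],
}

def numeral_confidence(template_coords, probes, probe_fn):
    """Fraction of a numeral's probe locations that contain a portion of the
    character; each additional hit raises the confidence level."""
    hits = sum(1 for loc in probes if probe_fn(template_coords, loc))
    return hits / len(probes)

def recognize_numeral(template_coords, probe_fn, threshold=0.75):
    """Return (numeral, confidence) for the best-matching numeral, or
    (None, confidence) when no numeral reaches the confidence threshold."""
    best, best_conf = None, 0.0
    for numeral, probes in NUMERAL_PROBES.items():
        conf = numeral_confidence(template_coords, probes, probe_fn)
        if conf > best_conf:
            best, best_conf = numeral, conf
    return (best, best_conf) if best_conf >= threshold else (None, best_conf)
```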


Furthermore, in some embodiments, the use of unique coordinate locations as described above allows for execution of a search procedure for identifying a numeral without necessarily first identifying whether the character is a numeral or a letter.



FIG. 7 shows a flowchart 700 of a method to execute optical character recognition in accordance with the disclosure. It is to be understood that any method steps or blocks shown in FIG. 7 represent modules, segments, or portions of code that include one or more executable instructions for implementing specific logical functions or steps in the method. In certain implementations, one or more of the steps may be performed manually. It will be appreciated that, although particular example method steps are described below, additional steps or alternative steps may be utilized in various implementations without detracting from the spirit of the invention. Moreover, steps may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on various alternative implementations. Code may also be contained in one or more devices, and may not necessarily be confined to any one particular type of device.


Block 705 of the flowchart 700 pertains to using a grid-based template, such as the grid-based template 300 described above, to first distinguish between the presence of one or more numerals in an image and one or more letters in the image. This action can be carried out as described above, by taking advantage of the characteristic that numerals defined in a monospaced font such as OCR-B are taller than letters that are also defined in the monospaced font. If a numeral cannot be distinguished from a letter (due to various reasons), the action indicated by block 705 assumes that the image contains numerals and the method step indicated in block 730 is executed. On the other hand, if a determination is made in block 705 that the image contains numerals and may contain letters as well, either the method step indicated in block 730 or the method step indicated in block 710 can be executed following execution of block 705.


Block 730 pertains to executing a numeral recognition procedure (as described above with reference to FIG. 6) using “n” coordinate locations in the grid-based template to uniquely identify one or more numerals. It should be understood that in some cases, “n” can be equal to 1, such as by examining only coordinate location 605 for detecting a portion of the numeral “1” with a certain level of confidence. The confidence level can be raised by confirming the identity of the numeral “1” by examining additional coordinate locations, such as coordinate location 610, coordinate location 615, and coordinate location 620. The improvement in confidence level is obtained at the expense of increased computation time. Consequently, the value of “n” that is selected for carrying out the action indicated in block 730 is based on a trade-off between confidence in recognition and speed of operation.


Block 735 pertains to using a look-up table to identify the detected data obtained by carrying out the action indicated in block 730. Thus, for example, the lookup table is used to identify the numeral “1” based on detecting the presence of a portion of the numeral at coordinate location 605 (and confirmed by the presence of other portions of the numeral at coordinate location 610, coordinate location 615, and/or coordinate location 620).


In block 740, a determination is made as to whether additional numerals contained in the image are to be recognized. If yes, operation proceeds from block 740 back to block 730. If no, operation proceeds from block 740 to block 745.


Block 745 pertains to assembling information on recognized numerals obtained by executing the previous blocks (block 730, block 735, and block 740). At this point, in one exemplary implementation, the action proceeds from block 745 to block 750 where the one or more recognized numerals are provided as a character recognition result. For example, the action indicated in block 750 can pertain to combining multiple recognized digits of a barcode label and providing the character recognition result to a computer for identifying an object to which the barcode label is affixed.


However, in another exemplary implementation, when a character recognition procedure involves recognizing both letters and numerals, the action proceeds from block 745 to block 710 (as indicated by dashed line 746).


In yet another exemplary implementation, rather than proceeding from block 745 to block 710, the method step indicated in block 710 is executed following execution of block 705. Subsequent actions indicated in block 715, block 720, and block 725 for recognizing one or more letters can then be executed in parallel with actions indicated in block 730, block 735, block 740, and block 745 for recognizing one or more numerals.


Block 710 pertains to executing a letter recognition procedure (as described above with reference to FIG. 4 and FIG. 5) using “n” coordinate locations in the grid-based template to uniquely identify one or more letters. The value of “n” can be selected using similar criteria as described above with respect to block 730 for recognizing numerals. It should be therefore understood that the value of “n” selected for carrying out the action indicated in block 710 is based on a trade-off between confidence in recognition and speed of operation.


Block 715 pertains to using a look-up table to identify the detected data obtained by carrying out the action indicated in block 710. Thus, for example, the lookup table is used to identify the letter “A” based on detecting the presence of a portion of the letter at coordinate location 405.


In block 720, a determination is made as to whether additional letters contained in the image are to be recognized. If yes, operation proceeds from block 720 back to block 710. If no, operation proceeds from block 720 to block 725.


Block 725 pertains to assembling information on recognized letters obtained by executing the previous blocks (block 710, block 715, and block 720). Action proceeds from block 725 to block 750 where the one or more recognized letters are combined with one or more recognized numerals (derived by executing actions indicated in block 745) and provided as a character recognition result.
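
Putting the pieces together, a high-level sketch of the flow of FIG. 7 might look as follows. It reuses the helper sketches introduced earlier (pixels_to_template, looks_like_numeral, recognize_letter, recognize_numeral, and probe), all of which are illustrative rather than taken from the disclosure.

```python
def recognize_characters(glyphs, cell_width_px):
    """Sketch of the FIG. 7 flow: classify each glyph as a numeral or a letter
    by centerline height (block 705), recognize it with the corresponding probe
    procedure and lookup table (blocks 730/735 or 710/715), and assemble the
    recognized characters into a result (blocks 725, 745, and 750)."""
    recognized = []
    for glyph in glyphs:
        coords = pixels_to_template(glyph, cell_width_px)
        if looks_like_numeral(coords):
            char, _confidence = recognize_numeral(coords, probe)
        else:
            char = recognize_letter(coords)
        recognized.append(char if char is not None else "?")
    return "".join(recognized)
```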


The description above pertained to using “n” coordinate locations to identify one or more letters and/or one or more numerals with various levels of confidence. A single-step character recognition procedure in accordance with the disclosure, which will be described below in more detail, involves presetting “n” to a certain value so as to quickly and uniquely recognize any letter or numeral in a single step. The presetting can be carried out in various ways such as by using statistics to identify a suitable Hamming distance that is indicative of differences between various characters.
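
One way to preset “n” along the lines suggested above is sketched below: given, for every character of the font, the probe results over an ordered pool of candidate coordinate locations, the smallest prefix length is chosen for which every pair of characters differs in at least a target number of probes (a minimum Hamming distance). The function names and the prefix-based strategy are assumptions made for illustration.

```python
from itertools import combinations

def min_hamming_distance(signatures):
    """Smallest pairwise Hamming distance among equal-length bit tuples
    (assumes at least two signatures are supplied)."""
    return min(sum(a != b for a, b in zip(s1, s2))
               for s1, s2 in combinations(signatures, 2))

def smallest_n_with_separation(per_char_bits, target_distance=1):
    """Given a mapping {character: tuple of probe results over a candidate pool},
    return the smallest prefix length n for which all characters are separated
    by at least target_distance probes, or None if the pool is insufficient."""
    pool_size = len(next(iter(per_char_bits.values())))
    for n in range(1, pool_size + 1):
        prefixes = [bits[:n] for bits in per_char_bits.values()]
        if min_hamming_distance(prefixes) >= target_distance:
            return n
    return None
```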



FIG. 8 shows an exemplary set of coordinate locations 805 (wherein “n” has been preset to an exemplary value equal to ten) and a look-up table 810 that can be used in conjunction with the set of coordinate locations 805 to uniquely identify any monospaced character using a single-step recognition procedure.


As can be understood from the set of coordinate locations 805, the first location of the ten locations corresponds to the coordinates (0,116) on a grid-based template such as the grid-based template 300 described above; the second location of the ten locations corresponds to the coordinates (0,64); the third location of the ten locations corresponds to the coordinates (0,7), and so on. The look-up table 810 includes uppercase letters (A to Z), numbers (0 to 9), and a “less than” symbol (“<”), each of which is formatted in an OCR-B font. The OCR-B font is based on a centerline drawing standard specified by the International Organization for Standardization (ISO), and the OCR-B character subset that is used in passports is defined by ICAO to include capital letters, numerals and the symbol “<” as enumerated in the look-up table 810.


A single-step recognition procedure in accordance with the disclosure involves using an OCR system (such as the OCR system 200 described above) to examine each of the ten locations identified in the set of coordinate locations 805. Thus, for example, if a portion of a character is detected at a coordinate location (12, 90), the OCR software 206 utilizes the set of coordinate locations 805 to recognize this coordinate location as corresponding to location 8. Let it be assumed, for purposes of example, that no other portion of the character is detected at any of the remaining nine locations in the set of coordinate locations 805. The OCR software 206 then utilizes the look-up table 810 to identify (via row 812) that the character to be recognized is the numeral “1”. On the other hand, if other portions of the character are detected, for example at locations 1, 2, 3, 9, and 10, the OCR software 206 utilizes the look-up table 810 to identify (via row 811) that the character is the letter “Z” and not the numeral “1”.
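
The single-step procedure can be sketched as a single pass over the ten locations followed by one table lookup. Only locations 1, 2, 3, and 8 have coordinates stated in the description; the remaining coordinates below are placeholders, and the two table rows shown (for “1” and “Z”) are inferred from the example above rather than copied from look-up table 810. The `probe` helper from the earlier sketch is reused.

```python
SINGLE_STEP_LOCATIONS = [
    (0, 116),    # location 1  (from the description)
    (0, 64),     # location 2  (from the description)
    (0, 7),      # location 3  (from the description)
    (-30, 116),  # location 4  -- placeholder
    (30, 116),   # location 5  -- placeholder
    (-30, 64),   # location 6  -- placeholder
    (30, 64),    # location 7  -- placeholder
    (12, 90),    # location 8  (from the description)
    (-12, 30),   # location 9  -- placeholder
    (12, 30),    # location 10 -- placeholder
]

# Fragment of a look-up table keyed by the ten probe results; only the two
# rows discussed in the example are shown.
SINGLE_STEP_TABLE = {
    (0, 0, 0, 0, 0, 0, 0, 1, 0, 0): "1",   # only location 8 contains the character
    (1, 1, 1, 0, 0, 0, 0, 1, 1, 1): "Z",   # locations 1, 2, 3, 8, 9, and 10 are hit
}

def single_step_recognize(template_coords, probe_fn):
    """Probe all ten locations once and look the resulting bit pattern up in
    the table; no centerline-height comparison is needed."""
    pattern = tuple(int(probe_fn(template_coords, loc))
                    for loc in SINGLE_STEP_LOCATIONS)
    return SINGLE_STEP_TABLE.get(pattern)
```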


It is also pertinent to point out that, unlike the character recognition procedures described above with respect to FIG. 4 and FIG. 6 that distinguish between a numeral and a letter based on the centerline height difference between numerals and letters of a monospaced font, the single-step recognition procedure does not require examination of the centerline height difference between two or more characters. Utilizing the single-step recognition procedure also provides a savings in time in comparison to many traditional character recognition procedures, and this savings in time can optionally be used to execute additional procedures such as using the grid-based template and/or applying statistics to confirm a character recognition result obtained via the single-step recognition procedure. Thus, for example, a grid-based template (such as the grid-based template 300) can be utilized to execute a character recognition procedure for confirming the identity of a character recognized by utilizing the single-step recognition procedure.


To supplement the present disclosure, this application incorporates entirely by reference the following commonly assigned patents, patent application publications, and patent applications:


U.S. Pat. No. 6,832,725; U.S. Pat. No. 7,128,266;


U.S. Pat. No. 7,159,783; U.S. Pat. No. 7,413,127;


U.S. Pat. No. 7,726,575; U.S. Pat. No. 8,294,969;


U.S. Pat. No. 8,317,105; U.S. Pat. No. 8,322,622;


U.S. Pat. No. 8,366,005; U.S. Pat. No. 8,371,507;


U.S. Pat. No. 8,376,233; U.S. Pat. No. 8,381,979;


U.S. Pat. No. 8,390,909; U.S. Pat. No. 8,408,464;


U.S. Pat. No. 8,408,468; U.S. Pat. No. 8,408,469;


U.S. Pat. No. 8,424,768; U.S. Pat. No. 8,448,863;


U.S. Pat. No. 8,457,013; U.S. Pat. No. 8,459,557;


U.S. Pat. No. 8,469,272; U.S. Pat. No. 8,474,712;


U.S. Pat. No. 8,479,992; U.S. Pat. No. 8,490,877;


U.S. Pat. No. 8,517,271; U.S. Pat. No. 8,523,076;


U.S. Pat. No. 8,528,818; U.S. Pat. No. 8,544,737;


U.S. Pat. No. 8,548,242; U.S. Pat. No. 8,548,420;


U.S. Pat. No. 8,550,335; U.S. Pat. No. 8,550,354;


U.S. Pat. No. 8,550,357; U.S. Pat. No. 8,556,174;


U.S. Pat. No. 8,556,176; U.S. Pat. No. 8,556,177;


U.S. Pat. No. 8,559,767; U.S. Pat. No. 8,599,957;


U.S. Pat. No. 8,561,895; U.S. Pat. No. 8,561,903;


U.S. Pat. No. 8,561,905; U.S. Pat. No. 8,565,107;


U.S. Pat. No. 8,571,307; U.S. Pat. No. 8,579,200;


U.S. Pat. No. 8,583,924; U.S. Pat. No. 8,584,945;


U.S. Pat. No. 8,587,595; U.S. Pat. No. 8,587,697;


U.S. Pat. No. 8,588,869; U.S. Pat. No. 8,590,789;


U.S. Pat. No. 8,596,539; U.S. Pat. No. 8,596,542;


U.S. Pat. No. 8,596,543; U.S. Pat. No. 8,599,271;


U.S. Pat. No. 8,599,957; U.S. Pat. No. 8,600,158;


U.S. Pat. No. 8,600,167; U.S. Pat. No. 8,602,309;


U.S. Pat. No. 8,608,053; U.S. Pat. No. 8,608,071;


U.S. Pat. No. 8,611,309; U.S. Pat. No. 8,615,487;


U.S. Pat. No. 8,616,454; U.S. Pat. No. 8,621,123;


U.S. Pat. No. 8,622,303; U.S. Pat. No. 8,628,013;


U.S. Pat. No. 8,628,015; U.S. Pat. No. 8,628,016;


U.S. Pat. No. 8,629,926; U.S. Pat. No. 8,630,491;


U.S. Pat. No. 8,635,309; U.S. Pat. No. 8,636,200;


U.S. Pat. No. 8,636,212; U.S. Pat. No. 8,636,215;


U.S. Pat. No. 8,636,224; U.S. Pat. No. 8,638,806;


U.S. Pat. No. 8,640,958; U.S. Pat. No. 8,640,960;


U.S. Pat. No. 8,643,717; U.S. Pat. No. 8,646,692;


U.S. Pat. No. 8,646,694; U.S. Pat. No. 8,657,200;


U.S. Pat. No. 8,659,397; U.S. Pat. No. 8,668,149;


U.S. Pat. No. 8,678,285; U.S. Pat. No. 8,678,286;


U.S. Pat. No. 8,682,077; U.S. Pat. No. 8,687,282;


U.S. Pat. No. 8,692,927; U.S. Pat. No. 8,695,880;


U.S. Pat. No. 8,698,949; U.S. Pat. No. 8,717,494;


U.S. Pat. No. 8,717,494; U.S. Pat. No. 8,720,783;


U.S. Pat. No. 8,723,804; U.S. Pat. No. 8,723,904;


U.S. Pat. No. 8,727,223; U.S. Pat. No. 8,740,082;


U.S. Pat. No. 8,740,085; U.S. Pat. No. 8,746,563;


U.S. Pat. No. 8,750,445; U.S. Pat. No. 8,752,766;


U.S. Pat. No. 8,756,059; U.S. Pat. No. 8,757,495;


U.S. Pat. No. 8,760,563; U.S. Pat. No. 8,763,909;


U.S. Pat. No. 8,777,108; U.S. Pat. No. 8,777,109;


U.S. Pat. No. 8,779,898; U.S. Pat. No. 8,781,520;


U.S. Pat. No. 8,783,573; U.S. Pat. No. 8,789,757;


U.S. Pat. No. 8,789,758; U.S. Pat. No. 8,789,759;


U.S. Pat. No. 8,794,520; U.S. Pat. No. 8,794,522;


U.S. Pat. No. 8,794,525; U.S. Pat. No. 8,794,526;


U.S. Pat. No. 8,798,367; U.S. Pat. No. 8,807,431;


U.S. Pat. No. 8,807,432; U.S. Pat. No. 8,820,630;


U.S. Pat. No. 8,822,848; U.S. Pat. No. 8,824,692;


U.S. Pat. No. 8,824,696; U.S. Pat. No. 8,842,849;


U.S. Pat. No. 8,844,822; U.S. Pat. No. 8,844,823;


U.S. Pat. No. 8,849,019; U.S. Pat. No. 8,851,383;


U.S. Pat. No. 8,854,633; U.S. Pat. No. 8,866,963;


U.S. Pat. No. 8,868,421; U.S. Pat. No. 8,868,519;


U.S. Pat. No. 8,868,802; U.S. Pat. No. 8,868,803;


U.S. Pat. No. 8,870,074; U.S. Pat. No. 8,879,639;


U.S. Pat. No. 8,880,426; U.S. Pat. No. 8,881,983;


U.S. Pat. No. 8,881,987; U.S. Pat. No. 8,903,172;


U.S. Pat. No. 8,908,995; U.S. Pat. No. 8,910,870;


U.S. Pat. No. 8,910,875; U.S. Pat. No. 8,914,290;


U.S. Pat. No. 8,914,788; U.S. Pat. No. 8,915,439;


U.S. Pat. No. 8,915,444; U.S. Pat. No. 8,916,789;


U.S. Pat. No. 8,918,250; U.S. Pat. No. 8,918,564;


U.S. Pat. No. 8,925,818; U.S. Pat. No. 8,939,374;


U.S. Pat. No. 8,942,480; U.S. Pat. No. 8,944,313;


U.S. Pat. No. 8,944,327; U.S. Pat. No. 8,944,332;


U.S. Pat. No. 8,950,678; U.S. Pat. No. 8,967,468;


U.S. Pat. No. 8,971,346; U.S. Pat. No. 8,976,030;


U.S. Pat. No. 8,976,368; U.S. Pat. No. 8,978,981;


U.S. Pat. No. 8,978,983; U.S. Pat. No. 8,978,984;


U.S. Pat. No. 8,985,456; U.S. Pat. No. 8,985,457;


U.S. Pat. No. 8,985,459; U.S. Pat. No. 8,985,461;


U.S. Pat. No. 8,988,578; U.S. Pat. No. 8,988,590;


U.S. Pat. No. 8,991,704; U.S. Pat. No. 8,996,194;


U.S. Pat. No. 8,996,384; U.S. Pat. No. 9,002,641;


U.S. Pat. No. 9,007,368; U.S. Pat. No. 9,010,641;


U.S. Pat. No. 9,015,513; U.S. Pat. No. 9,016,576;


U.S. Pat. No. 9,022,288; U.S. Pat. No. 9,030,964;


U.S. Pat. No. 9,033,240; U.S. Pat. No. 9,033,242;


U.S. Pat. No. 9,036,054; U.S. Pat. No. 9,037,344;


U.S. Pat. No. 9,038,911; U.S. Pat. No. 9,038,915;


U.S. Pat. No. 9,047,098; U.S. Pat. No. 9,047,359;


U.S. Pat. No. 9,047,420; U.S. Pat. No. 9,047,525;


U.S. Pat. No. 9,047,531; U.S. Pat. No. 9,053,055;


U.S. Pat. No. 9,053,378; U.S. Pat. No. 9,053,380;


U.S. Pat. No. 9,058,526; U.S. Pat. No. 9,064,165;


U.S. Pat. No. 9,064,165; U.S. Pat. No. 9,064,167;


U.S. Pat. No. 9,064,168; U.S. Pat. No. 9,064,254;


U.S. Pat. No. 9,066,032; U.S. Pat. No. 9,070,032;


U.S. Pat. No. 9,076,459; U.S. Pat. No. 9,079,423;


U.S. Pat. No. 9,080,856; U.S. Pat. No. 9,082,023;


U.S. Pat. No. 9,082,031; U.S. Pat. No. 9,084,032;


U.S. Pat. No. 9,087,250; U.S. Pat. No. 9,092,681;


U.S. Pat. No. 9,092,682; U.S. Pat. No. 9,092,683;


U.S. Pat. No. 9,093,141; U.S. Pat. No. 9,098,763;


U.S. Pat. No. 9,104,929; U.S. Pat. No. 9,104,934;


U.S. Pat. No. 9,107,484; U.S. Pat. No. 9,111,159;


U.S. Pat. No. 9,111,166; U.S. Pat. No. 9,135,483;


U.S. Pat. No. 9,137,009; U.S. Pat. No. 9,141,839;


U.S. Pat. No. 9,147,096; U.S. Pat. No. 9,148,474;


U.S. Pat. No. 9,158,000; U.S. Pat. No. 9,158,340;


U.S. Pat. No. 9,158,953; U.S. Pat. No. 9,159,059;


U.S. Pat. No. 9,165,174; U.S. Pat. No. 9,171,543;


U.S. Pat. No. 9,183,425; U.S. Pat. No. 9,189,669;


U.S. Pat. No. 9,195,844; U.S. Pat. No. 9,202,458;


U.S. Pat. No. 9,208,366; U.S. Pat. No. 9,208,367;


U.S. Pat. No. 9,219,836; U.S. Pat. No. 9,224,024;


U.S. Pat. No. 9,224,027; U.S. Pat. No. 9,230,140;


U.S. Pat. No. 9,235,553; U.S. Pat. No. 9,239,950;


U.S. Pat. No. 9,245,492; U.S. Pat. No. 9,248,640;


U.S. Pat. No. 9,250,652; U.S. Pat. No. 9,250,712;


U.S. Pat. No. 9,251,411; U.S. Pat. No. 9,258,033;


U.S. Pat. No. 9,262,633; U.S. Pat. No. 9,262,660;


U.S. Pat. No. 9,262,662; U.S. Pat. No. 9,269,036;


U.S. Pat. No. 9,270,782; U.S. Pat. No. 9,274,812;


U.S. Pat. No. 9,275,388; U.S. Pat. No. 9,277,668;


U.S. Pat. No. 9,280,693; U.S. Pat. No. 9,286,496;


U.S. Pat. No. 9,298,964; U.S. Pat. No. 9,301,427;


U.S. Pat. No. 9,313,377; U.S. Pat. No. 9,317,037;


U.S. Pat. No. 9,319,548; U.S. Pat. No. 9,342,723;


U.S. Pat. No. 9,361,882; U.S. Pat. No. 9,365,381;


U.S. Pat. No. 9,373,018; U.S. Pat. No. 9,375,945;


U.S. Pat. No. 9,378,403; U.S. Pat. No. 9,383,848;


U.S. Pat. No. 9,384,374; U.S. Pat. No. 9,390,304;


U.S. Pat. No. 9,390,596; U.S. Pat. No. 9,411,386;


U.S. Pat. No. 9,412,242; U.S. Pat. No. 9,418,269;


U.S. Pat. No. 9,418,270; U.S. Pat. No. 9,465,967;


U.S. Pat. No. 9,423,318; U.S. Pat. No. 9,424,454;


U.S. Pat. No. 9,436,860; U.S. Pat. No. 9,443,123;


U.S. Pat. No. 9,443,222; U.S. Pat. No. 9,454,689;


U.S. Pat. No. 9,464,885; U.S. Pat. No. 9,465,967;


U.S. Pat. No. 9,478,983; U.S. Pat. No. 9,481,186;


U.S. Pat. No. 9,487,113; U.S. Pat. No. 9,488,986;


U.S. Pat. No. 9,489,782; U.S. Pat. No. 9,490,540;


U.S. Pat. No. 9,491,729; U.S. Pat. No. 9,497,092;


U.S. Pat. No. 9,507,974; U.S. Pat. No. 9,519,814;


U.S. Pat. No. 9,521,331; U.S. Pat. No. 9,530,038;


U.S. Pat. No. 9,572,901; U.S. Pat. No. 9,558,386;


U.S. Pat. No. 9,606,581; U.S. Pat. No. 9,646,189;


U.S. Pat. No. 9,646,191; U.S. Pat. No. 9,652,648;


U.S. Pat. No. 9,652,653; U.S. Pat. No. 9,656,487;


U.S. Pat. No. 9,659,198; U.S. Pat. No. 9,680,282;


U.S. Pat. No. 9,697,401; U.S. Pat. No. 9,701,140;


U.S. Design Pat. No. D702,237;


U.S. Design Pat. No. D716,285;


U.S. Design Pat. No. D723,560;


U.S. Design Pat. No. D730,357;


U.S. Design Pat. No. D730,901;


U.S. Design Pat. No. D730,902;


U.S. Design Pat. No. D734,339;


U.S. Design Pat. No. D737,321;


U.S. Design Pat. No. D754,205;


U.S. Design Pat. No. D754,206;


U.S. Design Pat. No. D757,009;


U.S. Design Pat. No. D760,719;


U.S. Design Pat. No. D762,604;


U.S. Design Pat. No. D766,244;


U.S. Design Pat. No. D777,166;


U.S. Design Pat. No. D771,631;


U.S. Design Pat. No. D783,601;


U.S. Design Pat. No. D785,617;


U.S. Design Pat. No. D785,636;


U.S. Design Pat. No. D790,505;


U.S. Design Pat. No. D790,546;


International Publication No. 2013/163789;
U.S. Patent Application Publication No. 2008/0185432;
U.S. Patent Application Publication No. 2009/0134221;
U.S. Patent Application Publication No. 2010/0177080;
U.S. Patent Application Publication No. 2010/0177076;
U.S. Patent Application Publication No. 2010/0177707;
U.S. Patent Application Publication No. 2010/0177749;
U.S. Patent Application Publication No. 2010/0265880;
U.S. Patent Application Publication No. 2011/0202554;
U.S. Patent Application Publication No. 2012/0111946;
U.S. Patent Application Publication No. 2012/0168511;
U.S. Patent Application Publication No. 2012/0168512;
U.S. Patent Application Publication No. 2012/0193423;
U.S. Patent Application Publication No. 2012/0194692;
U.S. Patent Application Publication No. 2012/0203647;
U.S. Patent Application Publication No. 2012/0223141;
U.S. Patent Application Publication No. 2012/0228382;
U.S. Patent Application Publication No. 2012/0248188;
U.S. Patent Application Publication No. 2013/0043312;
U.S. Patent Application Publication No. 2013/0082104;
U.S. Patent Application Publication No. 2013/0175341;
U.S. Patent Application Publication No. 2013/0175343;
U.S. Patent Application Publication No. 2013/0257744;
U.S. Patent Application Publication No. 2013/0257759;
U.S. Patent Application Publication No. 2013/0270346;
U.S. Patent Application Publication No. 2013/0292475;
U.S. Patent Application Publication No. 2013/0292477;
U.S. Patent Application Publication No. 2013/0293539;
U.S. Patent Application Publication No. 2013/0293540;
U.S. Patent Application Publication No. 2013/0306728;
U.S. Patent Application Publication No. 2013/0306731;
U.S. Patent Application Publication No. 2013/0307964;
U.S. Patent Application Publication No. 2013/0308625;
U.S. Patent Application Publication No. 2013/0313324;
U.S. Patent Application Publication No. 2013/0332996;
U.S. Patent Application Publication No. 2014/0001267;
U.S. Patent Application Publication No. 2014/0025584;
U.S. Patent Application Publication No. 2014/0034734;
U.S. Patent Application Publication No. 2014/0036848;
U.S. Patent Application Publication No. 2014/0039693;
U.S. Patent Application Publication No. 2014/0049120;
U.S. Patent Application Publication No. 2014/0049635;
U.S. Patent Application Publication No. 2014/0061306;
U.S. Patent Application Publication No. 2014/0063289;
U.S. Patent Application Publication No. 2014/0066136;
U.S. Patent Application Publication No. 2014/0067692;
U.S. Patent Application Publication No. 2014/0070005;
U.S. Patent Application Publication No. 2014/0071840;
U.S. Patent Application Publication No. 2014/0074746;
U.S. Patent Application Publication No. 2014/0076974;
U.S. Patent Application Publication No. 2014/0097249;
U.S. Patent Application Publication No. 2014/0098792;
U.S. Patent Application Publication No. 2014/0100813;
U.S. Patent Application Publication No. 2014/0103115;
U.S. Patent Application Publication No. 2014/0104413;
U.S. Patent Application Publication No. 2014/0104414;
U.S. Patent Application Publication No. 2014/0104416;
U.S. Patent Application Publication No. 2014/0106725;
U.S. Patent Application Publication No. 2014/0108010;
U.S. Patent Application Publication No. 2014/0108402;
U.S. Patent Application Publication No. 2014/0110485;
U.S. Patent Application Publication No. 2014/0125853;
U.S. Patent Application Publication No. 2014/0125999;
U.S. Patent Application Publication No. 2014/0129378;
U.S. Patent Application Publication No. 2014/0131443;
U.S. Patent Application Publication No. 2014/0133379;
U.S. Patent Application Publication No. 2014/0136208;
U.S. Patent Application Publication No. 2014/0140585;
U.S. Patent Application Publication No. 2014/0152882;
U.S. Patent Application Publication No. 2014/0158770;
U.S. Patent Application Publication No. 2014/0159869;
U.S. Patent Application Publication No. 2014/0166759;
U.S. Patent Application Publication No. 2014/0168787;
U.S. Patent Application Publication No. 2014/0175165;
U.S. Patent Application Publication No. 2014/0191684;
U.S. Patent Application Publication No. 2014/0191913;
U.S. Patent Application Publication No. 2014/0197304;
U.S. Patent Application Publication No. 2014/0214631;
U.S. Patent Application Publication No. 2014/0217166;
U.S. Patent Application Publication No. 2014/0231500;
U.S. Patent Application Publication No. 2014/0247315;
U.S. Patent Application Publication No. 2014/0263493;
U.S. Patent Application Publication No. 2014/0263645;
U.S. Patent Application Publication No. 2014/0270196;
U.S. Patent Application Publication No. 2014/0270229;
U.S. Patent Application Publication No. 2014/0278387;
U.S. Patent Application Publication No. 2014/0288933;
U.S. Patent Application Publication No. 2014/0297058;
U.S. Patent Application Publication No. 2014/0299665;
U.S. Patent Application Publication No. 2014/0332590;
U.S. Patent Application Publication No. 2014/0351317;
U.S. Patent Application Publication No. 2014/0362184;
U.S. Patent Application Publication No. 2014/0363015;
U.S. Patent Application Publication No. 2014/0369511;
U.S. Patent Application Publication No. 2014/0374483;
U.S. Patent Application Publication No. 2014/0374485;
U.S. Patent Application Publication No. 2015/0001301;
U.S. Patent Application Publication No. 2015/0001304;
U.S. Patent Application Publication No. 2015/0009338;
U.S. Patent Application Publication No. 2015/0014416;
U.S. Patent Application Publication No. 2015/0021397;
U.S. Patent Application Publication No. 2015/0028104;
U.S. Patent Application Publication No. 2015/0029002;
U.S. Patent Application Publication No. 2015/0032709;
U.S. Patent Application Publication No. 2015/0039309;
U.S. Patent Application Publication No. 2015/0039878;
U.S. Patent Application Publication No. 2015/0040378;
U.S. Patent Application Publication No. 2015/0049347;
U.S. Patent Application Publication No. 2015/0051992;
U.S. Patent Application Publication No. 2015/0053769;
U.S. Patent Application Publication No. 2015/0062366;
U.S. Patent Application Publication No. 2015/0063215;
U.S. Patent Application Publication No. 2015/0088522;
U.S. Patent Application Publication No. 2015/0096872;
U.S. Patent Application Publication No. 2015/0100196;
U.S. Patent Application Publication No. 2015/0102109;
U.S. Patent Application Publication No. 2015/0115035;
U.S. Patent Application Publication No. 2015/0127791;
U.S. Patent Application Publication No. 2015/0128116;
U.S. Patent Application Publication No. 2015/0133047;
U.S. Patent Application Publication No. 2015/0134470;
U.S. Patent Application Publication No. 2015/0136851;
U.S. Patent Application Publication No. 2015/0142492;
U.S. Patent Application Publication No. 2015/0144692;
U.S. Patent Application Publication No. 2015/0144698;
U.S. Patent Application Publication No. 2015/0149946;
U.S. Patent Application Publication No. 2015/0161429;
U.S. Patent Application Publication No. 2015/0178523;
U.S. Patent Application Publication No. 2015/0178537;
U.S. Patent Application Publication No. 2015/0178685;
U.S. Patent Application Publication No. 2015/0181109;
U.S. Patent Application Publication No. 2015/0199957;
U.S. Patent Application Publication No. 2015/0210199;
U.S. Patent Application Publication No. 2015/0212565;
U.S. Patent Application Publication No. 2015/0213647;
U.S. Patent Application Publication No. 2015/0220753;
U.S. Patent Application Publication No. 2015/0220901;
U.S. Patent Application Publication No. 2015/0227189;
U.S. Patent Application Publication No. 2015/0236984;
U.S. Patent Application Publication No. 2015/0239348;
U.S. Patent Application Publication No. 2015/0242658;
U.S. Patent Application Publication No. 2015/0248572;
U.S. Patent Application Publication No. 2015/0254485;
U.S. Patent Application Publication No. 2015/0261643;
U.S. Patent Application Publication No. 2015/0264624;
U.S. Patent Application Publication No. 2015/0268971;
U.S. Patent Application Publication No. 2015/0269402;
U.S. Patent Application Publication No. 2015/0288689;
U.S. Patent Application Publication No. 2015/0288896;
U.S. Patent Application Publication No. 2015/0310243;
U.S. Patent Application Publication No. 2015/0310244;
U.S. Patent Application Publication No. 2015/0310389;
U.S. Patent Application Publication No. 2015/0312780;
U.S. Patent Application Publication No. 2015/0327012;
U.S. Patent Application Publication No. 2016/0014251;
U.S. Patent Application Publication No. 2016/0025697;
U.S. Patent Application Publication No. 2016/0026838;
U.S. Patent Application Publication No. 2016/0026839;
U.S. Patent Application Publication No. 2016/0040982;
U.S. Patent Application Publication No. 2016/0042241;
U.S. Patent Application Publication No. 2016/0057230;
U.S. Patent Application Publication No. 2016/0062473;
U.S. Patent Application Publication No. 2016/0070944;
U.S. Patent Application Publication No. 2016/0092805;
U.S. Patent Application Publication No. 2016/0101936;
U.S. Patent Application Publication No. 2016/0104019;
U.S. Patent Application Publication No. 2016/0104274;
U.S. Patent Application Publication No. 2016/0109219;
U.S. Patent Application Publication No. 2016/0109220;
U.S. Patent Application Publication No. 2016/0109224;
U.S. Patent Application Publication No. 2016/0112631;
U.S. Patent Application Publication No. 2016/0112643;
U.S. Patent Application Publication No. 2016/0117627;
U.S. Patent Application Publication No. 2016/0124516;
U.S. Patent Application Publication No. 2016/0125217;
U.S. Patent Application Publication No. 2016/0125342;
U.S. Patent Application Publication No. 2016/0125873;
U.S. Patent Application Publication No. 2016/0133253;
U.S. Patent Application Publication No. 2016/0171597;
U.S. Patent Application Publication No. 2016/0171666;
U.S. Patent Application Publication No. 2016/0171720;
U.S. Patent Application Publication No. 2016/0171775;
U.S. Patent Application Publication No. 2016/0171777;
U.S. Patent Application Publication No. 2016/0174674;
U.S. Patent Application Publication No. 2016/0178479;
U.S. Patent Application Publication No. 2016/0178685;
U.S. Patent Application Publication No. 2016/0178707;
U.S. Patent Application Publication No. 2016/0179132;
U.S. Patent Application Publication No. 2016/0179143;
U.S. Patent Application Publication No. 2016/0179368;
U.S. Patent Application Publication No. 2016/0179378;
U.S. Patent Application Publication No. 2016/0180130;
U.S. Patent Application Publication No. 2016/0180133;
U.S. Patent Application Publication No. 2016/0180136;
U.S. Patent Application Publication No. 2016/0180594;
U.S. Patent Application Publication No. 2016/0180663;
U.S. Patent Application Publication No. 2016/0180678;
U.S. Patent Application Publication No. 2016/0180713;
U.S. Patent Application Publication No. 2016/0185136;
U.S. Patent Application Publication No. 2016/0185291;
U.S. Patent Application Publication No. 2016/0186926;
U.S. Patent Application Publication No. 2016/0188861;
U.S. Patent Application Publication No. 2016/0188939;
U.S. Patent Application Publication No. 2016/0188940;
U.S. Patent Application Publication No. 2016/0188941;
U.S. Patent Application Publication No. 2016/0188942;
U.S. Patent Application Publication No. 2016/0188943;
U.S. Patent Application Publication No. 2016/0188944;
U.S. Patent Application Publication No. 2016/0189076;
U.S. Patent Application Publication No. 2016/0189087;
U.S. Patent Application Publication No. 2016/0189088;
U.S. Patent Application Publication No. 2016/0189092;
U.S. Patent Application Publication No. 2016/0189284;
U.S. Patent Application Publication No. 2016/0189288;
U.S. Patent Application Publication No. 2016/0189366;
U.S. Patent Application Publication No. 2016/0189443;
U.S. Patent Application Publication No. 2016/0189447;
U.S. Patent Application Publication No. 2016/0189489;
U.S. Patent Application Publication No. 2016/0192051;
U.S. Patent Application Publication No. 2016/0202951;
U.S. Patent Application Publication No. 2016/0202958;
U.S. Patent Application Publication No. 2016/0202959;
U.S. Patent Application Publication No. 2016/0203021;
U.S. Patent Application Publication No. 2016/0203429;
U.S. Patent Application Publication No. 2016/0203797;
U.S. Patent Application Publication No. 2016/0203820;
U.S. Patent Application Publication No. 2016/0204623;
U.S. Patent Application Publication No. 2016/0204636;
U.S. Patent Application Publication No. 2016/0204638;
U.S. Patent Application Publication No. 2016/0227912;
U.S. Patent Application Publication No. 2016/0232891;
U.S. Patent Application Publication No. 2016/0292477;
U.S. Patent Application Publication No. 2016/0294779;
U.S. Patent Application Publication No. 2016/0306769;
U.S. Patent Application Publication No. 2016/0314276;
U.S. Patent Application Publication No. 2016/0314294;
U.S. Patent Application Publication No. 2016/0316190;
U.S. Patent Application Publication No. 2016/0323310;
U.S. Patent Application Publication No. 2016/0325677;
U.S. Patent Application Publication No. 2016/0327614;
U.S. Patent Application Publication No. 2016/0327930;
U.S. Patent Application Publication No. 2016/0328762;
U.S. Patent Application Publication No. 2016/0330218;
U.S. Patent Application Publication No. 2016/0343163;
U.S. Patent Application Publication No. 2016/0343176;
U.S. Patent Application Publication No. 2016/0364914;
U.S. Patent Application Publication No. 2016/0370220;
U.S. Patent Application Publication No. 2016/0372282;
U.S. Patent Application Publication No. 2016/0373847;
U.S. Patent Application Publication No. 2016/0377414;
U.S. Patent Application Publication No. 2016/0377417;
U.S. Patent Application Publication No. 2017/0010141;
U.S. Patent Application Publication No. 2017/0010328;
U.S. Patent Application Publication No. 2017/0010780;
U.S. Patent Application Publication No. 2017/0016714;
U.S. Patent Application Publication No. 2017/0018094;
U.S. Patent Application Publication No. 2017/0046603;
U.S. Patent Application Publication No. 2017/0047864;
U.S. Patent Application Publication No. 2017/0053146;
U.S. Patent Application Publication No. 2017/0053147;
U.S. Patent Application Publication No. 2017/0053647;
U.S. Patent Application Publication No. 2017/0055606;
U.S. Patent Application Publication No. 2017/0060316;
U.S. Patent Application Publication No. 2017/0061961;
U.S. Patent Application Publication No. 2017/0064634;
U.S. Patent Application Publication No. 2017/0083730;
U.S. Patent Application Publication No. 2017/0091502;
U.S. Patent Application Publication No. 2017/0091706;
U.S. Patent Application Publication No. 2017/0091741;
U.S. Patent Application Publication No. 2017/0091904;
U.S. Patent Application Publication No. 2017/0092908;
U.S. Patent Application Publication No. 2017/0094238;
U.S. Patent Application Publication No. 2017/0098947;
U.S. Patent Application Publication No. 2017/0100949;
U.S. Patent Application Publication No. 2017/0108838;
U.S. Patent Application Publication No. 2017/0108895;
U.S. Patent Application Publication No. 2017/0118355;
U.S. Patent Application Publication No. 2017/0123598;
U.S. Patent Application Publication No. 2017/0124369;
U.S. Patent Application Publication No. 2017/0124396;
U.S. Patent Application Publication No. 2017/0124687;
U.S. Patent Application Publication No. 2017/0126873;
U.S. Patent Application Publication No. 2017/0126904;
U.S. Patent Application Publication No. 2017/0139012;
U.S. Patent Application Publication No. 2017/0140329;
U.S. Patent Application Publication No. 2017/0140731;
U.S. Patent Application Publication No. 2017/0147847;
U.S. Patent Application Publication No. 2017/0150124;
U.S. Patent Application Publication No. 2017/0169198;
U.S. Patent Application Publication No. 2017/0171035;
U.S. Patent Application Publication No. 2017/0171703;
U.S. Patent Application Publication No. 2017/0171803;
U.S. Patent Application Publication No. 2017/0180359;
U.S. Patent Application Publication No. 2017/0180577;
U.S. Patent Application Publication No. 2017/0181299;
U.S. Patent Application Publication No. 2017/0190192;
U.S. Patent Application Publication No. 2017/0193432;
U.S. Patent Application Publication No. 2017/0193461;
U.S. Patent Application Publication No. 2017/0193727;
U.S. Patent Application Publication No. 2017/0199266;
U.S. Patent Application Publication No. 2017/0200108; and
U.S. Patent Application Publication No. 2017/0200275.

In the specification and/or figures, exemplary embodiments of the invention have been disclosed. The present disclosure is not limited to such exemplary embodiments. The use of the term “and/or” includes any and all combinations of one or more of the associated listed items. The figures are schematic representations and so are not necessarily drawn to scale. Unless otherwise noted, specific terms have been used in a generic and descriptive sense and not for purposes of limitation.

Claims
  • 1. A method comprising: using an optical character recognition system to execute an optical character recognition procedure, the optical character recognition procedure comprising: applying a grid-based template to a character having a monospaced font; defining in the grid-based template, a first grid section that includes a first portion of the character when the character has a first size and excludes the first portion of the character when the character has a second size that is smaller than the first size; and recognizing the character as a numeral when the first grid section includes the first portion of the character.
  • 2. The method of claim 1, wherein the grid-based template has a rectangular shape, the second size is indicative of a letter of an alphabet, and the first grid section is selected to provide an indication of a difference in centerline height between the numeral and the letter of the alphabet.
  • 3. The method of claim 2, wherein the first grid section is a portion of a grid pattern that uses numerical coordinates of an x-y mapping system.
  • 4. The method of claim 1, wherein the monospaced font is an OCR-B font.
  • 5. The method of claim 1, wherein the optical character recognition procedure further comprises: defining in the grid-based template, a second grid section that houses a second portion of the character when the character has one of the first size or the second size; and recognizing the character as a letter of an alphabet when the first grid section excludes the first portion of the character and the second grid section includes the second portion of the character.
  • 6. The method of claim 1, wherein the optical character recognition procedure further comprises: examining a group of three locations in a central area of the grid-based template; and detecting an identity of one of a subset of letters of an alphabet based at least in part on examining the group of three locations.
  • 7. The method of claim 1, wherein the optical character recognition procedure further comprises: examining a group of three locations located on a vertical axis of the grid-based template; using a lookup table to interpret a result of examining the group of three locations located on the vertical axis; and detecting an identity of one of a subset of letters of an alphabet based on using the lookup table.
  • 8. The method of claim 1, wherein the optical character recognition procedure further comprises: defining in the grid-based template, a second grid section that excludes the first portion of the character when the character has the first size; and recognizing the numeral as belonging to a first subset in a set of numerals when the first grid section includes the first portion of the character and the second grid section excludes the first portion of the character.
  • 9. The method of claim 1, wherein the optical character recognition procedure further comprises: defining in the grid-based template, a minimal group of grid sections that includes a second portion of the character when the character has the first size; and uniquely identifying the numeral among a first subset in a set of numerals when the minimal group of grid sections includes the second portion of the character.
  • 10. A method comprising: providing to an optical character recognition system, a barcode label containing a plurality of digits; and using the optical character recognition system to execute an optical character recognition procedure, the optical character recognition procedure comprising: applying a bounding box to an individual digit among the plurality of digits contained in the barcode label; applying a grid-based template to the bounding box, the grid-based template comprising a plurality of grid sections; and using the plurality of grid sections to identify the individual digit contained in the bounding box.
  • 11. The method of claim 10, wherein the barcode label conforms to at least one standard that is characterized by a European Article Number (EAN), and wherein each digit in the plurality of digits has a monospaced font.
  • 12. The method of claim 10, wherein the optical character recognition procedure further comprises: designating a first grid section of the grid-based template as a first unique identification area in the grid-based template, the first unique identification area selected to contain a portion of the individual digit and exclude all portions of all other digits in the plurality of digits when the individual digit is present in the bounding box, seeking a positive match in the first grid section of the grid-based template to detect the portion of the individual digit, and detecting an identity of the individual digit based on obtaining the positive match.
  • 13. The method of claim 12, wherein the optical character recognition procedure further comprises: designating a second grid section of the grid-based template as a second unique identification area in the grid-based template, the second unique identification area selected to exclude all portions of all other digits in the plurality of digits when the individual digit is present in the bounding box; and confirming the identity of the individual digit by seeking a second match in the second grid section of the grid-based template to detect another portion of the individual digit.
  • 14. The method of claim 12, wherein the optical character recognition procedure further comprises: designating a second grid section of the grid-based template as a second unique identification area in the grid-based template, the second unique identification area selected to exclude all portions of all digits in the plurality of digits.
  • 15. The method of claim 14, wherein each of a set of four grid sections located at four corners of the grid-based template constitutes the second unique identification area.
  • 16. A method comprising: using an optical character recognition system to execute an optical character recognition procedure, the optical character recognition procedure comprising: applying a bounding box to a character; applying a grid-based template to the bounding box; defining a portion of the grid-based template as a primary search area; and using at least the primary search area to identify the character contained in the bounding box.
  • 17. The method of claim 16, wherein the character has a monospaced font and comprises any one of a numeral, a letter of an alphabet, or a mathematical symbol; and wherein the grid-based template has a grid pattern that is characterized using numerical coordinates of an x-y mapping system.
  • 18. The method of claim 17, wherein the optical character recognition procedure further comprises: defining a set of locations inside the primary search area, each location of the set of locations identifiable by coordinates of the x-y mapping system; and detecting in the set of locations, a portion of the character, the portion of the character uniquely identifying the character from among a set of characters.
  • 19. The method of claim 18, wherein the set of locations is equal to “n” locations (n≥2).
  • 20. The method of claim 17, wherein using at least the primary search area comprises defining the primary search area as one of an oval or a circle having a diameter that is substantially equal to a width of the monospaced font.
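By way of illustration only, the grid-section test recited in claims 1-5 can be sketched in a few lines of Python. The sketch assumes a binarized glyph image that spans the full monospaced character cell (so that a shorter lowercase letter leaves the upper grid sections empty) and an arbitrary 6x4 grid; the names to_grid, looks_like_numeral, GRID_ROWS, and GRID_COLS are hypothetical and do not appear in the specification.

import numpy as np

# Assumed grid resolution; the claims do not fix a particular number of sections.
GRID_ROWS, GRID_COLS = 6, 4

def to_grid(cell: np.ndarray) -> np.ndarray:
    """Map a binarized character cell (True = ink) onto a GRID_ROWS x GRID_COLS
    template and report which grid sections contain any ink."""
    h, w = cell.shape
    occupied = np.zeros((GRID_ROWS, GRID_COLS), dtype=bool)
    for r in range(GRID_ROWS):
        for c in range(GRID_COLS):
            r0, r1 = r * h // GRID_ROWS, (r + 1) * h // GRID_ROWS
            c0, c1 = c * w // GRID_COLS, (c + 1) * w // GRID_COLS
            occupied[r, c] = cell[r0:r1, c0:c1].any()
    return occupied

def looks_like_numeral(occupied: np.ndarray) -> bool:
    """Claim-1 style test: a 'first grid section' near the top of the monospaced
    cell holds ink for a full-height numeral but stays empty for a shorter letter,
    reflecting the centerline-height difference noted in claim 2."""
    first_grid_section = occupied[0, :]  # assumed placement of the first grid section
    return bool(first_grid_section.any())

A caller would, for example, binarize one character cell of an OCR-B line, compute occupied = to_grid(cell), and treat a positive looks_like_numeral(occupied) result as the claim-1 recognition of the character as a numeral; this placement of the first grid section is an assumption for the sketch, not a requirement of the claims.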
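Similarly, the "unique identification area" test of claims 10-15 can be sketched under stated assumptions. The grid coordinates in UNIQUE_AREAS are placeholders standing in for sections that, for a given monospaced font, would contain ink only when one particular digit occupies the bounding box, and the four corner sections of claim 15 are treated as always-empty confirmation areas; the names identify_digit, UNIQUE_AREAS, and EMPTY_CORNERS are illustrative and not drawn from the specification.

import numpy as np

# Placeholder (row, col) grid sections assumed to contain ink only for that digit.
UNIQUE_AREAS = {7: (1, 3), 4: (2, 0)}
# Claim-15 style corner sections assumed to exclude all portions of all digits.
EMPTY_CORNERS = [(0, 0), (0, -1), (-1, 0), (-1, -1)]

def identify_digit(occupied: np.ndarray):
    """Return the digit whose unique identification area tests positive, after
    confirming that the always-empty corner sections really are empty."""
    if any(occupied[r, c] for r, c in EMPTY_CORNERS):
        return None  # ink in a corner suggests noise or a misplaced bounding box
    for digit, (r, c) in UNIQUE_AREAS.items():
        if occupied[r, c]:
            return digit  # positive match in the first unique identification area
    return None

In use, occupied = to_grid(digit_cell) followed by identify_digit(occupied) would stand in for the claim-12 sequence of seeking a positive match in a designated grid section and detecting the digit's identity from that match; a production system would derive the unique areas from the actual EAN digit shapes rather than from the placeholder table above.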