The hardware can comprise at least one communications/output unit 105, at least one display unit 110, at least one central processing unit (CPU) 115, at least one hard disk unit 120, at least one memory unit 125, and at least one input unit 130. The communications/output unit 105 can send results of extraction processing to, for example, a screen, printer, disk, computer and/or application. The display unit 110 can display information. The CPU 115 can interpret and execute instructions from the hardware and/or software components. The hard disk unit 120 can receive information (e.g., documents, data) from the CPU 115, the memory unit 125, and/or the input unit 130. The memory unit 125 can store information. The input unit 130 can receive information (e.g., at least one document image or other data) for processing from, for example, a screen, scanner, disk, computer, application, keyboard, mouse, or other human or non-human input device, or any combination thereof.
The software can comprise one or more databases 145, at least one OCR boosting module 150, at least one image processing module 155, at least one OCR module 160, at least one document input module 165, at least one document conversion module 170, at least one text processing statistical analysis module 175, at least one document/output post processing module 180, and at least one systems administration module 185. The database 145 can store information. The image processing module 155 can include software which can process images. The OCR module 160 can include software which can generate a textual representation of the image scanned in by the input unit 130 (using, for example, a scanner). It should be noted that multiple OCR modules 160 can be utilized, in one embodiment. In addition, different parameter sets and different image preprocessing can be utilized. For example, parameter sets that can be utilized for different OCR modules can comprise, but are not limited to: certain dictionaries, applicable languages, and character subsets to be recognized (e.g., all digits or all characters). Image preprocessing can include, but is not limited to: rotation correction, noise removal, edge enhancement filters (e.g., enhancing the edge contrast of an image to make the edge look more defined), color space modifications (e.g., translating the representation of a color from one reference frame to another to make the translated image look more similar to the original image), and any combination thereof. The document input module 165 can include software which can work with preprocessed documents (e.g., preprocessed in system 100 or elsewhere) to obtain information (e.g., used for training). For example, if documents are available that were already OCRed, the information from these documents (e.g., imagelets and characters) can be used in the OCR booster training phase to create OCR booster sets. Document representations (e.g., images and/or OCR text) can be sent to the OCR boosting module 150, which can perform learning, extraction and validation.
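As a rough illustration of the multiple-OCR-pass idea described above, the sketch below generates a few preprocessing variants of a page image with Pillow and runs each variant through an OCR engine with different character subsets. The `ocr_engine` callable, the specific filter choices, and the example rotation angle are assumptions made for illustration only, not part of the original disclosure.

```python
from PIL import ImageFilter, ImageOps

def preprocessing_variants(image):
    """Yield (name, image) pairs for a few illustrative preprocessing options."""
    yield "original", image
    yield "rotation_corrected", image.rotate(-1.5, expand=True, fillcolor="white")  # example angle only
    yield "noise_removed", image.filter(ImageFilter.MedianFilter(3))
    yield "edge_enhanced", image.filter(ImageFilter.EDGE_ENHANCE)
    yield "grayscale", ImageOps.grayscale(image)  # simple color space modification

def ocr_engine(image, charset=None, language="eng"):
    """Placeholder for OCR module 160 (e.g., a wrapper around a real OCR engine)."""
    raise NotImplementedError

def run_ocr_passes(image):
    """Collect OCR results for several preprocessing/parameter-set combinations."""
    results = {}
    for name, variant in preprocessing_variants(image):
        for charset in (None, "0123456789"):  # e.g., all characters vs. digits only
            results[(name, charset)] = ocr_engine(variant, charset=charset)
    return results
```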
The OCR boosting module 150 can perform learning, extraction and validation, as discussed further below.
Referring to
Note that, in one embodiment, one document, one page, parts of pages (e.g., zones, paragraphs, lines, or words), multiple pages, or multiple documents, or any combination thereof can be input into the OCR boosting module 150. Because method 200 can be based on small document parts rather than full documents or multiple documents, the generalization needed to identify the character(s) of interest (e.g., across multiple fonts, multiple font scales, and multiple font properties such as bold or italic) can be minimized, because there is often less variance when a smaller set (e.g., a line or paragraph) rather than a larger set (e.g., multiple pages) is input into the OCR boosting module 150. Additionally, in one embodiment, method 200 can be utilized for each subset of characters. Thus, the potential confusion between characters within an OCR learn set can be restricted to such character subsets. Examples of such subsets are digits, punctuation marks, small alphabetical characters, capital alphabetical characters, etc. It should also be noted that an OCR learn set can include manually input examples. For example, if a certain set of invoices always has the same OCR error, an operator may want to put in the correct character and add it to the OCR learn set so that future documents will have a higher likelihood of being correct.
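As a minimal sketch of the character-subset idea, the grouping below partitions an OCR seed so that learning and boosting can be performed per subset; the particular groups chosen are illustrative assumptions rather than the groups prescribed by the system.

```python
def character_subset(ch):
    """Assign a recognized character to a coarse subset (illustrative grouping)."""
    if ch.isdigit():
        return "digits"
    if ch.isalpha():
        return "capital_letters" if ch.isupper() else "small_letters"
    return "punctuation_and_other"

def split_seed_by_subset(ocr_seed):
    """ocr_seed: iterable of (character, imagelet) pairs from the initial OCR run."""
    subsets = {}
    for ch, imagelet in ocr_seed:
        subsets.setdefault(character_subset(ch), []).append((ch, imagelet))
    return subsets  # each subset can then be learned and boosted independently
```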
In 210, at least one OCR seed is created by compiling the imagelets (i.e., image parts or character imagelets) corresponding to the characters obtained by the OCR module 160 from the image of the document. The imagelets can be obtained by extracting each character imagelet from the document image. A character segmentation algorithm can be used to generate the character imagelet based on the character imagelet's coordinates (e.g., represented by a bounding box) in the document image. For examples of character segmentation algorithms, see Casey, R. G. et al., A Survey of Methods and Strategies in Character Segmentation, IEEE Trans. Pattern Anal. Mach. Intell., Vol. 18, No. 7 (July 1996), 690-706.
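For illustration, the snippet below builds such an OCR seed by cropping one imagelet per recognized character from the page image. The assumption that the OCR module reports a (left, top, right, bottom) bounding box per character is a simplification of whatever segmentation output is actually available.

```python
from PIL import Image

def extract_imagelets(page_image, ocr_characters):
    """
    page_image: PIL.Image of the scanned page.
    ocr_characters: iterable of (character, (left, top, right, bottom)) bounding boxes.
    Returns the OCR seed as a list of (character, imagelet) pairs.
    """
    seed = []
    for ch, (left, top, right, bottom) in ocr_characters:
        imagelet = page_image.crop((left, top, right, bottom))
        seed.append((ch, imagelet))
    return seed

# usage sketch (hypothetical file and OCR output):
# seed = extract_imagelets(Image.open("scanned_page.png"), ocr_results)
```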
For example, referring to
It should be noted that character imagelets are often not all the same, due to OCR errors. However, as OCR modules 160 can return a large proportion (e.g., more than 50%) of correct characters, the correct class can dominate the dataset.
Referring again to
315 of
It should be noted that the OCR learn set can also be used to train classifiers (e.g., support vector machines, neural networks) directly. The imagelets and the respective OCR initial results can be used as input for such trainable classifiers, and the training can be performed according to certain algorithms. Such algorithms can include, but are not limited to, support vector machines, neural networks, Bayes classifiers, decision trees, and bootstrapping methods. The actual OCR boosting (i.e., obtaining a second opinion on given OCR results for given imagelets) can be performed by applying the pre-trained classifiers (this process can be referred to as a classification phase). Classifiers can be trained based on filtered, unfiltered, preprocessed, or raw imagelet sets.
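A hedged sketch of this direct-training route, using a scikit-learn support vector machine on flattened, size-normalized imagelets: the 16x16 normalization and the choice of SVC are illustrative, and any of the other algorithm families mentioned above could be substituted.

```python
import numpy as np
from sklearn.svm import SVC

def to_feature_vector(imagelet, size=(16, 16)):
    """Normalize a PIL imagelet to a fixed size and flatten it to a pixel vector."""
    gray = imagelet.convert("L").resize(size)
    return np.asarray(gray, dtype=np.float32).ravel() / 255.0

def train_booster_classifier(learn_set):
    """learn_set: list of (character, imagelet) pairs, e.g., the cleaned OCR seed."""
    X = np.stack([to_feature_vector(img) for _, img in learn_set])
    y = [ch for ch, _ in learn_set]
    clf = SVC(probability=True)  # probability=True enables confidence-like scores
    return clf.fit(X, y)

def classify_imagelet(clf, imagelet):
    """Classification phase: return a second-opinion (character, confidence) pair."""
    probs = clf.predict_proba([to_feature_vector(imagelet)])[0]
    best = int(np.argmax(probs))
    return clf.classes_[best], float(probs[best])
```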
As noted above,
In 410, the OCR learn set learned in 220 is input. As noted above, the OCR learn set can contain an average and variance for each imagelet of interest.
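One way to picture these statistics is a per-character mean image and variance image computed over all imagelets the OCR assigned to that character. The NumPy sketch below assumes the imagelets have already been normalized to a common size, which is a simplification.

```python
import numpy as np

def build_learn_set_statistics(learn_set):
    """
    learn_set: list of (character, pixel_array) pairs, all arrays of the same shape.
    Returns {character: (mean_image, variance_image)}.
    """
    by_char = {}
    for ch, pixels in learn_set:
        by_char.setdefault(ch, []).append(np.asarray(pixels, dtype=np.float64))
    stats = {}
    for ch, arrays in by_char.items():
        stack = np.stack(arrays)
        stats[ch] = (stack.mean(axis=0), stack.var(axis=0))
    return stats
```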
In 415, each imagelet of interest from the new document is compared to the OCR learn set.
Referring back to
In 425, the OCR booster results for a particular imagelet are found by selecting the closest match between the image being analyzed and the images in the learn set. Thus, each character imagelet from the original OCR scanned image can be reclassified using the OCR learn set. This can help resolve inconsistencies and boost the OCR correction rate. In 425, a confidence rating can also be provided.
As noted above, a confidence rating can be calculated for the OCR booster results for each imagelet of interest by comparing the statistics of the OCR booster learn set for each learned character with the actual imagelet. Confidence values can also be obtained by application of trained classifiers (e.g., support vector machines, neural networks). Thus, the confidence rating of 5.64 given here as an example can be understood as a relative score between the presented imagelet to be OCR boosted and the learn set at its present state.
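A plausible way to implement this matching and scoring is a variance-weighted distance between the imagelet and each learned character's mean image, taking the best-scoring character as the booster result. The formula below is an illustrative choice and is not the specific score that produced the 5.64 example above.

```python
import numpy as np

def score_against_class(pixels, mean_image, variance_image, eps=1.0):
    """Lower is better: squared deviation from the class mean, down-weighted where the class itself varies."""
    diff = (np.asarray(pixels, dtype=np.float64) - mean_image) ** 2
    return float(np.mean(diff / (variance_image + eps)))

def boost_imagelet(pixels, learn_set_stats):
    """Return (best_character, {character: score}) for one imagelet of interest."""
    scores = {
        ch: score_against_class(pixels, mean_img, var_img)
        for ch, (mean_img, var_img) in learn_set_stats.items()
    }
    best_char = min(scores, key=scores.get)
    return best_char, scores
```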
The confidence rating for each character within the learn set can be used as the basis for a confusion matrix. The confusion matrix may be helpful in further refining fuzzy searching approaches (e.g., by changing the N-gram statistics accordingly), dictionary lookups, or validation-rule- or regular-expression-based information retrieval from documents, as the confidence rating obtained by the OCR boosting can narrow down the number of potential characters to be checked (as only characters with high confidence ratings will be checked), and thus avoid combinatorial explosions. For example, for the character sequence “28/01/2009”, the OCR booster can return a confidence rating for each of the characters 0-9 indicating how likely each is to be the accurate character at the first position of the month field.
From these confidence ratings, the number of potential boosted characters to check can be reduced from ten (0, 1, 2, 3, 4, 5, 6, 7, 8 and 9) to two (0 and 9), because all the other characters have very low scores. Given the information that the data is a character sequence with a date of format XX/XX/XXXX, the “9” at that position can be ruled out as well. Thus, for any further processing, the character of interest can be assumed to be only a “0”.
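A small sketch of this pruning step, combining a confidence threshold with the date-format constraint; the threshold value and the DD/MM/YYYY interpretation are assumptions made for this example.

```python
def prune_candidates(confidences, threshold=1.0):
    """confidences: {character: confidence rating}; keep only well-rated candidates."""
    return {ch for ch, rating in confidences.items() if rating >= threshold}

def apply_month_constraint(candidates):
    """For a DD/MM/YYYY field, the first digit of the month can only be '0' or '1'."""
    return candidates & {"0", "1"}

# e.g., the booster leaves {"0", "9"}; the format constraint then leaves only "0"
surviving = apply_month_constraint({"0", "9"})
```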
It should also be noted that OCR booster learn sets containing probability weighted 2D pixel information for the current document (or page, zone, etc.) can be generated and can be dynamically adapted while changing documents (or pages, zones, etc.) by keeping a history of imagelets and adding new ones successively.
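One standard way to keep such statistics current as documents (or pages or zones) change is an online mean/variance update per character; the Welford-style update below is offered only as a sketch of that idea, not as the adaptation scheme of the disclosure.

```python
import numpy as np

class RunningImageletStats:
    """Online per-character mean and variance, updated as new imagelets arrive."""

    def __init__(self):
        self.count, self.mean, self.m2 = {}, {}, {}

    def update(self, ch, pixels):
        x = np.asarray(pixels, dtype=np.float64)
        n = self.count.get(ch, 0) + 1
        mean = self.mean.get(ch, np.zeros_like(x))
        m2 = self.m2.get(ch, np.zeros_like(x))
        delta = x - mean
        mean = mean + delta / n
        m2 = m2 + delta * (x - mean)  # Welford's online update of the sum of squared deviations
        self.count[ch], self.mean[ch], self.m2[ch] = n, mean, m2

    def variance(self, ch):
        """Population variance image for a character seen at least twice, else zeros."""
        n = self.count.get(ch, 0)
        return self.m2[ch] / n if n > 1 else np.zeros_like(self.mean[ch])
```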
In addition, it should be noted that the OCR booster learn sets containing confidence rated 2D pixel information can include the variance of the character pixels and the noise statistics of the whitespace surrounding the characters. Statistics about the background noise can be quite useful for devising filters to remove that noise. The OCR booster learn set can contain the statistics of the characters (e.g., their respective pixels) and the variance of the edges (examples of both shown in 320a and 320b). Additionally, the statistics of the background (e.g., the space beside the character pixels) can be obtained. From this, statistics about speckle noise can be derived and utilized as additional input in despeckle filter design.
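A rough illustration of obtaining such background statistics: look only at a thin border of each grayscale imagelet, where mostly whitespace is expected, and measure how many pixels are nevertheless dark. The margin width, the threshold, and the notion that the border is background are simplifying assumptions.

```python
import numpy as np

def background_noise_rate(imagelet_pixels, margin=2, dark_threshold=128):
    """Fraction of dark (speckle) pixels in the outer border of a grayscale imagelet."""
    a = np.asarray(imagelet_pixels, dtype=np.float64)
    border = np.ones(a.shape, dtype=bool)
    border[margin:-margin, margin:-margin] = False  # keep only the outer frame
    return float(np.mean(a[border] < dark_threshold))

# averaging this rate over many imagelets gives a speckle-density estimate for despeckle filter design
```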
Furthermore, when OCR booster learning and application are performed on a single page, even rotation of the document can be incorporated into the OCR booster learn set, as the entire page is rotated rather than each character individually. This results in a relative rotation of 0 degrees between the characters within one page, even when the entire page is rotated.
It should also be noted that, as discussed above with respect to 205 of
Furthermore, as the OCR learn set can be based on a smaller set, image distortion will not be as common. For example, if multiple pages are used to create the OCR learn set, the pages often are not aligned perfectly as they are scanned, and thus the imagelets will have more variability, increasing the generalization that must be done. If, however, only one paragraph of one page is used, there will be no such image distortion, as only one page is scanned. It should be noted that a high amount of generalization causes errors that a human would not likely make (e.g., mistaking a slightly distorted “8” for an “f”). In contrast, a low amount of generalization often causes errors that a human would make (e.g., mistaking a “1” (one) for an “l” (letter l)). Making only errors that a human would make can increase acceptability of the system 100, as the system 100 would be no more inaccurate than a human, and would often cost much less to use than a human.
It should be noted that any of the information found utilizing the system 100 and method 200 above (e.g., the OCR seed, the OCR cleaned seed, the OCR learn set, as well as any information related to the mismatch distribution, OCR booster results, and confidence rating) can be stored (e.g., as the learn set, the imagelet collection or the respective statistics). This information can be re-used when an image from the same or similar class is to be reviewed. The information can be used as part or replacement of the initial OCR run, creating the OCR seed, or as a basis of the learn set for the document, thus increasing the reliability of the statistics. In addition, the overall procedure of the method 200 can be performed iteratively to allow for a refinement of the data, preprocessing methods, and/or other parameters.
While various embodiments of the present invention have been described above, it should be understood that they have been presented by way of example, and not limitation. It will be apparent to persons skilled in the relevant art(s) that various changes in form and detail can be made therein without departing from the spirit and scope of the present invention. Thus, the present invention should not be limited by any of the above-described exemplary embodiments.
In addition, it should be understood that the figures described above, which highlight the functionality and advantages of the present invention, are presented for example purposes only. The architecture of the present invention is sufficiently flexible and configurable, such that it may be utilized in ways other than that shown in the figures.
Further, the purpose of the Abstract of the Disclosure is to enable the U.S. Patent and Trademark Office and the public generally, and especially the scientists, engineers and practitioners in the art who are not familiar with patent or legal terms or phraseology, to determine quickly from a cursory inspection the nature and essence of the technical disclosure of the application. The Abstract of the Disclosure is not intended to be limiting as to the scope of the present invention in any way.
Finally, it is the applicant's intent that only claims that include the express language “means for” or “step for” be interpreted under 35 U.S.C. 112, paragraph 6. Claims that do not expressly include the phrase “means for” or “step for” are not to be interpreted under 35 U.S.C. 112, paragraph 6.
Entry |
---|
Translation for JP 2009-238217. |
Translation for JP 2005-038077. |
International Search Report issued in PCT/IB2010/003251, mailed May 2, 2011. |
Written Opinion issued in PCT/IB2010/003251, mailed May 2, 2011. |
International Search Report issued in PCT/IB2010/003250, mailed May 2, 2011. |
Written Opinion issued in PCT/IB2010/003250, mailed May 2, 2011. |
Suzanne Liebowitz Taylor et al., “Extraction of Data from Preprinted Forms,” Machine Vision and Applications, vol. 5, pp. 211-222, Jan. 1992. |
International Search Report issued in PCT/IB2010/050087, mailed May 27, 2011. |
Written Opinion issued in PCT/IB2010/050087, mailed May 27, 2011. |
File History of U.S. Appl. No. 12/106,450 electronically captured on Sep. 22, 2011. |
File History of U.S. Appl. No. 10/204,756 electronically captured on Sep. 22, 2011. |
File History of U.S. Appl. No. 12/610,915 electronically captured on Sep. 22, 2011. |
File History of U.S. Appl. No. 12/610,937 electronically captured on Sep. 22, 2011. |
File History of U.S. Appl. No. 12/570,412 electronically captured on Sep. 22, 2011. |
File History of U.S. Appl. No. 13/024,086 electronically captured on Sep. 22, 2011. |
File History of U.S. Appl. No. 13/192,703 electronically captured on Sep. 22, 2011. |
File History of U.S. Appl. No. 12/106,450 electronically captured on Nov. 10, 2010. |
Office Action issued in Canadian Patent Application No. 2,459,182, mailed Oct. 26, 2010. |
File History of U.S. Appl. No. 10/204,756 electronically captured on Nov. 30, 2011 for Sep. 22, 2011 to Nov. 30, 2011. |
File History of U.S. Appl. No. 12/610,915 electronically captured on Nov. 30, 2011 for Sep. 22, 2011 to Nov. 30, 2011. |
File History of U.S. Appl. No. 12/208,088 electronically captured on Nov. 30, 2011 for May 12, 2011 to Nov. 30, 2011. |
Office Action issued in Canadian Application No. CA 2,419,776, dated Aug. 16, 2011. |
Notice of Allowance issued in Canadian Application No. 2,459,182, dated Oct. 28, 2011. |
International Search Report issued in PCT/IB2010/003252, mailed Jan. 24, 2012. |
Written Opinion issued in PCT/IB2010/003252, mailed Jan. 24, 2012. |
A. Dengel et al., “Chapter 8: Techniques for Improving OCR Results”, Handbook of Character Recognition and Document Image Analysis, pp. 227-258, Copyright 1997 World Scientific Publishing Company. |
Remy Mullot, “Les Documents Ecrits”, Jan. 1, 2006, Lavoisier—Hermes Science, pp. 351-355, “Section 7.6.3 Reconnaissance des caracteres speciaux ou endommages”. |
L. Solan, “The Language of the Judges”, University of Chicago Press, pp. 45-54, Jan. 1, 1993. |
File History of U.S. Appl. No. 13/024,086 electronically captured on Mar. 8, 2012 for Jan. 11, 2012 to Mar. 8, 2012. |
File History of U.S. Appl. No. 12/208,088 electronically captured on May 9, 2012, for Jan. 12, 2012 to May 9, 2012. |
File History of U.S. Appl. No. 12/610,915 electronically captured on May 9, 2012, for Nov. 30, 2011 to May 9, 2012. |
File History of U.S. Appl. No. 12/610,937 electronically captured on May 9, 2012, for Sep. 23, 2011 to May 9, 2012. |
File History of U.S. Appl. No. 12/191,774 electronically captured on Sep. 16, 2010. |
File History of U.S. Appl. No. 10/204,756 electronically captured on Sep. 16, 2010. |
File History of U.S. Appl. No. 12/106,450 electronically captured on May 12, 2011. |
File History of U.S. Appl. No. 10/204,756 electronically captured on May 12, 2011. |
File History of U.S. Appl. No. 10/208,088 electronically captured on May 12, 2011. |
File History of U.S. Appl. No. 13/024,086 electronically captured on May 12, 2011. |
European Office Action issued in EP 01127768.8, mailed Feb. 17, 2011. |
R.M. Lea et al., “An Associative File Store Using Fragments for Run-Time Indexing and Compression”, SIGIR'80, Proceedings of the 3rd Annual ACM Conference on Research and Development in Information Retrieval, pp. 280-295 (1980). |
M. Koga et al., “Lexical Search Approach for Character-String Recognition”, DAS'98, LNCS 1655, pp. 115-129 (1999). |
File History of U.S. Appl. No. 12/191,774 electronically captured on Feb. 24, 2011. |
File History of U.S. Appl. No. 10/204,756 electronically captured on Feb. 24, 2011. |
File History of U.S. Appl. No. 13/024,086 electronically captured on Feb. 24, 2011. |
File History of U.S. Appl. No. 10/204,756 electronically captured on Jan. 11, 2012, for Dec. 1, 2011 to Jan. 11, 2012. |
File History of U.S. Appl. No. 12/208,088 electronically captured on Jan. 11, 2012, for Dec. 1, 2011 to Jan. 11, 2012. |
File History of U.S. Appl. No. 13/024,086 electronically captured on Jan. 11, 2012, for Sep. 23, 2011 to Jan. 11, 2012. |
File History of U.S. Appl. No. 13/192,703 electronically captured on Jan. 11, 2012, for Sep. 23, 2011 to Jan. 11, 2012. |
James Wnek, “Learning to Identify Hundreds of Flex-Form Documents”, IS&T/SPIE Conference on Document Recognition and Retrieval VI, San Jose, CA, SPIE vol. 3651, pp. 173-182 (Jan. 1999). |
English translation of Remy MULLOT, “Les Documents Ecrits”, Jan. 1, 2006, Lavoisier—Hermes Science, pp. 351-355, “Section 7.6.3 Reconnaissance des caracteres speciaux ou endommages”. |
Dave, et al., Mining the Peanut Gallery: Opinion Extraction and Semantic Classification of Product Reviews, ACM 1-58113-680-03/03/0005, May 20-24, 2003, pp. 1-10. |
Dreier, Blog Fingerprinting: Identifying Anonymous Posts Written by an Author of Interest Using Word and Character Frequency Analysis, Master's Thesis, Naval Postgraduate School, Monterey, California, Sep. 2009, pp. 1-93. |
File History of U.S. Appl. No. 10/204,756 electronically captured on Mar. 13, 2013 for Jan. 11, 2012 to Mar. 13, 2013. |
File History of U.S. Appl. No. 12/208,088 electronically captured on Mar. 13, 2013 for May 9, 2012 to Mar. 13, 2013. |
File History of U.S. Appl. No. 12/610,915 electronically captured on Mar. 13, 2013 for May 9, 2012 to Mar. 13, 2013. |
File History of U.S. Appl. No. 12/610,937 electronically captured on Mar. 13, 2013 for May 9, 2012 to Mar. 13, 2013. |
File History of U.S. Appl. No. 12/570,412 electronically captured on Mar. 13, 2013 for Sep. 22, 2011 to Mar. 13, 2013. |
File History of U.S. Appl. No. 13/024,086 electronically captured on Mar. 13, 2013 for Mar. 8, 2012 to Mar. 13, 2013. |
File History of U.S. Appl. No. 13/192,703 electronically captured on Mar. 13, 2013 for Jan. 11, 2012 to Mar. 13, 2013. |
File History of U.S. Appl. No. 13/477,645 electronically captured on Mar. 13, 2013. |
File History of U.S. Appl. No. 13/548,042 electronically captured on Mar. 13, 2013. |
File History of U.S. Appl. No. 13/624,443 electronically captured on Mar. 13, 2013. |
File History of U.S. Appl. No. 10/204,756 electronically captured on May 22, 2014 for Nov. 25, 2013 to May 22, 2014. |
File History of U.S. Appl. No. 12/610,915 electronically captured on May 22, 2014 for Nov. 25, 2013 to May 22, 2014. |
File History of U.S. Appl. No. 12/610,937 electronically captured on May 22, 2014 for Nov. 25, 2013 to May 22, 2014. |
File History of U.S. Appl. No. 13/192,703 electronically captured on May 22, 2014 for Nov. 25, 2013 to May 22, 2014. |
File History of U.S. Appl. No. 13/477,645 electronically captured on May 22, 2014 for Nov. 25, 2013 to May 22, 2014. |
Office Action issued in Canadian Application No. CA 2,796,392, mailed Apr. 30, 2014. |
File History of U.S. Appl. No. 10/204,756 electronically captured on Nov. 25, 2013. |
File History of U.S. Appl. No. 12/610,915 electronically captured on Nov. 25, 2013. |
File History of U.S. Appl. No. 12/610,937 electronically captured on Nov. 25, 2013. |
File History of U.S. Appl. No. 13/192,703 electronically captured on Nov. 25, 2013. |
File History of U.S. Appl. No. 13/477,645 electronically captured on Nov. 25, 2013. |
File History of U.S. Appl. No. 09/214,682 electronically captured on Nov. 25, 2013. |
Office Action issued in Canadian Application No. CA 2,776,891, mailed Oct. 22, 2013. |
Office Action issued in Canadian Application No. CA 2,796,392, mailed Sep. 26, 2013. |
A. Krikelis et al., “Associative Processing and Processors”, Computer, US, IEEE Computer Society, Long Beach, CA, US, vol. 27, No. 11, Nov. 1, 1994, pp. 12-17, XP000483129. |
Motonobu Hattori, “Knowledge Processing System Using Multi-Mode Associate Memory”, IEEE, vol. 1, pp. 531-536 (May 4-9, 1998). |
H. Saiga et al., “An OCR System for Business Cards”, Proc. of the 2nd Int Conf. on Document Analysis and Recognition, Oct. 20-22, 1993, pp. 802-805, XP002142832. |
Simon Berkovich et al., “Organization of Near Matching in Bit Attribute Matrix Applied to Associative Access Methods in information Retrieval”, Pro. of the 16th IASTED int Conf. Applied Informatics, Feb. 23-25, 1998, Garmisch-Partenkirchen, Germany. |
D. Freitag, “Information extraction from HTML: application of a general machine learning approach”, Proc. 15th National Conference on Artificial Intelligence (AAAI-98); Tenth Conference on Innovative Applications of Artificial Intelligence, Proceedings of the Fifteenth National Conference on Artificial Intelligence, Madison, WI, USA, pp. 517-523, XP002197239, 1998, Menlo Park, CA, USA, AAAI Press/MIT Press, USA, ISBN: 0-262-51098-7. |
F. Aurenhammer, “Voronoi Diagrams—A Survey of a Fundamental Geometric Data Structure”, ACM Computing Surveys, vol. 23, No. 3, Sep. 1991, pp. 345-405. |
European Office Action issued in European Application No. 01 905 729.8, mailed Nov. 22, 2007. |
Witten et al., “Managing Gigabytes” pp. 128-142. |
Bo-ren Bai et al. “Syllable-based Chinese text/spoken documents retrieval using text/speech queries”, Int'l Journal of Pattern Recognition. 14.05 (2000): 603-616. |
Foreign Search Report issued in EP 01127768.8 mailed Sep. 12, 2002. |
European Office Action issued in EP 97931718.7, mailed May 8, 2002. |
File History of U.S. Appl. No. 11/240,525 electronically captured on May 19, 2010. |
File History of U.S. Appl. No. 09/561,196 electronically captured on May 19, 2010. |
File History of U.S. Appl. No. 11/240,632 electronically captured on May 19, 2010. |
File History of U.S. Appl. No. 10/362,027 electronically captured on May 19, 2010. |
File History of U.S. Appl. No. 12/191,774 electronically captured on May 19, 2010. |
File History of U.S. Appl. No. 12/106,450 electronically captured on May 19, 2010. |
File History of U.S. Appl. No. 10/204,756 electronically captured on May 19, 2010. |
File History of U.S. Appl. No. 12/208,088 electronically captured on May 19, 2010. |
File History of U.S. Appl. No. 12/610,915 electronically captured on May 19, 2010. |
File History of U.S. Appl. No. 12/610,937 electronically captured on May 19, 2010. |
File History of U.S. Appl. No. 12/570,412 electronically captured on May 27, 2010. |
File History of U.S. Appl. No. 10/204,756 electronically captured on Sep. 12, 2014 for May 22, 2014 to Sep. 12, 2014. |
File History of U.S. Appl. No. 12/610,915 electronically captured on Sep. 12, 2014 for May 22, 2014 to Sep. 12, 2014. |
File History of U.S. Appl. No. 12/610,937 electronically captured on Sep. 12, 2014 for May 22, 2014 to Sep. 12, 2014. |
File History of U.S. Appl. No. 13/192,703 electronically captured on Sep. 12, 2014 for May 22, 2014 to Sep. 12, 2014. |
File History of U.S. Appl. No. 13/477,645 electronically captured on Sep. 12, 2014 for May 22, 2014 to Sep. 12, 2014. |
File History of U.S. Appl. No. 13/624,443 electronically captured on Sep. 12, 2014 for Mar. 13, 2013 to Sep. 12, 2014. |
Office Action issued in Japanese Application No. 2012-532203, mailed Jul. 15, 2014. |
Office Action issued in Canadian Application No. 2,776,891 dated May 13, 2014. |
English language translation of Office Action issued in Japanese Application No. 2012-537459, mailed Jun. 17, 2014. |
English language abstract and translation of JP 10-091712 published Apr. 10, 1998. |
English language abstract and translation of JP 2003-524258 published Aug. 12, 2003. |
English language abstract of JP 1-277977 published Nov. 8, 1989. |
English language abstract of JP 2-186484 published Jul. 20, 1990. |
English language abstract and translation of JP 7-271916 published Oct. 20, 1995. |
English language abstract and translation of JP 11-184976 published Jul. 9, 1999. |
English language abstract and translation of JP 2000-155803 published Jun. 6, 2000. |
Partial English language translation of Office Action issued in Japanese Application No. 2012-532203, mailed Jul. 15, 2014. |
Office Action issued in Japanese Application No. JP 2012-537457 dated Apr. 18, 2014. |
English language translation of Office Action issued in Japanese Application No. JP 2012-537457 dated Apr. 18, 2014. |
English language abstract of JP 2009-238217, published Oct. 15, 2009. |
English language abstract and machine translation of JP 2005-038077, published Feb. 10, 2005. |
G. Salton et al., “A Vector Space Model for Automatic Indexing”, Communications of the ACM, vol. 18, No. 11, pp. 613-620, Nov. 1975. |
Office Action issued in Australian Application No. 2010311066 dated Oct. 29, 2014. |
Suzanne Liebowitz Taylor et al., “Extraction of Data from Preprinted Forms”, Machine Vision and Applications, vol. 5, No. 3, pp. 211-222 (1992). |
Office Action issued in Australian Application No. 2010311067 dated Oct. 21, 2014. |
Office Action issued in Japanese Application No. 2012-537458 mailed Jun. 17, 2014. |
English language translation of Office Action issued in Japanese Application No. 2012-537458, mailed Jun. 17, 2014. |
Office Action issued in Australian Application No. 2013205566 dated Dec. 9, 2014. |
Stephen V. Rice et al., “The Fifth Annual Test of OCR Accuracy”, Information Science Research institute, TR-96-01, Apr. 1996 (48 pages). |
Thomas A. Nartker et al., “Software Tools and Test Data for Research and Testing of Page-Reading OCR Systems”, IS&T/SPIE Symposium on Electronic Imaging Science and Technology, Jan. 2005 (11 pages). |
Tin Kam Ho et al., “OCR with No Shape Training”, International Conference on Pattern Recognition (2000) (4 pages). |
Karen Kukich, “Techniques for Automatically Correcting Words in Text”, ACM Computing Surveys, vol. 24, No. 4, pp. 377-439, Dec. 1992. |
Atsuhiro Takasu, “An Approximate Multi-Word Matching Algorithm for Robust Document Retrieval”, CIKM'06, Nov. 5-11, 2006 (9 pages). |
Kenneth Ward Church et al., “Word Association Norms, Mutual Information, and Lexicography”, Computational Linguistics, vol. 16, No. 1, pp. 22-29, Mar. 1990. |
Anil N. Hirani et al., “Combining Frequency and Spatial Domain Information for Fast Interactive Image Noise Removal”, Proceedings of the 23rd Annual Conference on Computer Graphics and interactive Techniques, pp. 269-276 (1996). |
Leonid L. Rudin et al., “Nonlinear Total Variation Based Noise Removal Algorithms”, Physica D, vol. 60, pp. 259-268 (1992). |
Richard Alan Peters, II, “A New Algorithm for Image Noise Reduction Using Mathematical Morphology”, IEEE Transactions on Image Processing, vol. 4, No. 3, pp. 554-568, May 1995. |
George Nagy, “Twenty Years of Document Image Analysis in PAMI”, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, No. 1, pp. 38-62, Jan. 2000. |
Stephen V. Rice et al., “Optical Character Recognition: An Illustrated Guide to the Frontier”, Kluwer Academic Publishers, Copyright Apr. 1999. |
T.A. Bayer et al., “A Generic System for Processing Invoices”, 4th Int. Conf. on Document Analysis and Recognition, pp. 740-744 (1997), Aug. 1997. |
D.P. Lopresti et al., “A Fast Technique for Comparing Graph Representations with Applications To Performance Evaluation”, Int. Journal on Document Analysis and Recognition (IJDAR), vol. 6, pp. 219-229 (2004), Feb. 2004. |
Takashi Hirano et al., “Field Extraction Method from Existing Forms Transmitted by Facsimile”, In Proceedings of the Sixth International Conference on Document Analysis and Recognition (ICDAR), pp. 738-742, Sep. 10-13, 2001. |
Donato Malerba et al., “Automated Discovery of Dependencies Between Logical Components in Document Image Understanding”, In Proceedings of the Sixth International Conference on Document Analysis and Recognition (ICDAR), Sep. 10-13, 2001 (5 pages). |
Frederick Schultz et al., “Seizing the Treasure: Transferring Layout Knowledge in Invoice Analysis”, Proceedings of the 10th International Conference on Document Analysis and Recognition (ICDAR) (2009), Jul. 2009. |
“Brainware: Invoice Processing”, downloaded from Internet Archive at http://web.archive.org/web/20080512043505/http://www.brainware.com/invoice.php, archived May 12, 2008 (1 page). |
“Kofax: Solutions: Invoice Processing”, downloaded from Internet Archive at http://web.archive.org/web/20081012062841/http://www.kofax.com/solutions/invoice-processing.asp, archived Oct. 12, 2008 (1 page). |
“Kofax: Solutions: Kofax White Papers”, downloaded from Internet Archive at http://web.archive.org/web/20080928052442/http://www.kofax.com/solutions/white-papers.asp, archived Sep. 28, 2008 (2 pages). |
A. Krikelis et al., “Associative Processing and Processors” Computer, US, IEEE Computer Society, Long Beach, CA, US, vol. 27, No. 11, Nov. 1, 1994, pp. 12-17, XP000483129. |
International Search Report issued in related International Application No. PCT/EP01/09577, mailed Nov. 5, 2003. |
Motonobu Hattori, “Knowledge Processing System Using Multi-Mode Associate Memory”, IEEE, vol. 1, pp. 531-536 (May 4-9, 1998). |
International Search Report issued in International Application No. PCT/EP01/01132, mailed May 30, 2001. |
H. Saiga et al., “An OCR System for Business Cards”, Proc. of the 2nd Int. Conf. on Document Analysis and Recognition, Oct. 20-22, 1993, pp. 802-805, XP002142832. |
Junliang Xue et al., “Destination Address Block Location on handwritten Chinese Envelope”, Proc. of the 5th Int. Conf. on Document Analysis and Recognition, Sep. 20-22, 1999, pp. 737-740, XP002142833. |
Machine Translation of JP 11-184894. |
Simon Berkovich et al., “Organization of Near Matching In Bit Attribute Matrix Applied to Associative Access Methods In information Retrieval”, Pro. Of the 16th IASTED Int. Conf. Applied Informatics, Feb. 23-25, 1998, Garmisch-Partenkirchen, Germany. |
European Office Action issued in Application No. 01127768.8 mailed Sep. 10, 2008. |
European Office Action issued in Application No. 01120429.4 mailed Sep. 16, 2008. |
D. Freitag, “Information extraction from HTML: application of a general machine learning approach”, Proc. 15th National Conference on Artificial Intelligence (AAAI-98); Tenth Conference on Innovative Applications of Artificial Intelligence, Proceedings of the Fifteenth National Conference on Artificial Intelligence, Madison, WI, USA, pp. 517-523, XP002197239, 1998, Menlo Park, CA, USA, AAAI Press/MIT Press, USA, ISBN: 0-262-51098-7. |
European Office Action issued in Application No. 01120429.4 mailed Jun. 30, 2006. |
European Office Action issued in Application No. 01120429.4 mailed Apr. 23, 2004. |
E. Appiani et al., “Automatic document classification and indexing in high-volume applications”, International Journal on Document Analysis and Recognition, Dec. 2001, Springer-Verlag, Germany, vol. 4, No. 2, pp. 69-83, XP002197240, ISSN: 1433-2833. |
A. Ting et al., “Form recognition using linear structure”, Pattern Recognition, Pergamon Press Inc., Elmsford, NY, US, vol. 32, No. 4, Apr. 1999, pp. 645-656, XP004157536, ISSN: 0031-3203. |
International Search Report issued in International Application PCT/US02/27132 issued Nov. 12, 2002. |
“East text search training”, Jan. 2000. |
European Search Report issued in European Office Action 01120429.4 mailed Jul. 19, 2002. |
International Search Report issued in International Application PCT/DE97/01441, mailed Nov. 12, 1997. |
Voorhees et al., “Vector expansion in a large collection”, NIST Special Publication, US, Gaithersburg, MD, pp. 343-351. |
M. Marchand et al., “A Convergence Theorem for Sequential Learning in Two-Layer Perceptrons”, Europhysics Letters, vol. 11, No. 6, Mar. 15, 1990, pp. 487-492. |
F. Aurenhammer, Voronoi Diagrams —“A survey of a fundamental geometric data structure”, ACM Computing Surveys, vol. 23, No. 3, Sep. 1991, pp. 345-405. |
C. Reyes et al., “A Clustering Technique for Random Data Classification”, International Conference on Systems, Man and Cybernetics, IEEE, p. 316-321. |
International Search Report issued in PCT/EP00/03097 mailed Sep. 14, 2000. |
International Preliminary Examination Report issued in PCT/EP00/03097 mailed Jul. 31, 2001. |
Written Opinion issued in PCT/EP00/03097 mailed Apr. 21, 2001. |
Japanese Office Action issued in Japanese Application No. 2003-522903, mailed Jul. 29, 2008. |
English language translation of Japanese Office Action issued in Japanese Application No. 2003-522903, mailed Jul. 29, 2008. |
European Office Action issued in European Application No. 01 905729.8, mailed Nov. 22, 2007. |
Foreign Office Action issued in EP 00117850.8 mailed May 20, 2008. |
Foreign Office Action issued in EP 00117850.8 mailed Jul. 20, 2006. |
EP Search Report issued in EP 00117850.8 mailed Jun. 12, 2001. |
International Notification issued in PCT/EP00/03097 mailed May 5, 2001. |
G. Lapir, “Use of Associative Access Method for Information Retrieval System”, Proc. 23rd Pittsburgh Conf. on Modeling & Simulation, 1992, pp. 951-958. |
Witten et al., “Managing Gigabytes” pp. 128-142, 1999. |
C.J. Date, “An Introduction to Database Systems, Sixth Edition”, Addison-Wesley Publishing Company, pp. 52-65 (1995). |
International Search Report for PCT/US00/23784 mailed Oct. 26, 2000. |
Australian Office Action in Australian application 2002331728 mailed Nov. 16, 2006. |
Australian Notice of Acceptance issued in Australian application 2002331728 mailed Feb. 20, 2008. |
Foreign Office Action issued in EP 01127768.8 mailed Feb. 5, 2007. |
Foreign Office Action issued in EP 01127768.8 mailed Sep. 8, 2005. |
Bo-ren Bai et al. “Syllable-based Chinese text/spoken documents retrieval using text/speech queries”, Int'l Journal of Pattern Recognition, 2000. |
Foreign Search Report issued in EP 01127768.8 mailed Sep. 12, 2002. |
European Search Report issued in EP 00103810.8, mailed Aug. 1, 2000. |
European Office Action issued in EP 00103810.8, mailed May 23, 2002. |
International Preliminary Examination Report issued in International Application No. PCT/DE97/01441, mailed Jul. 21, 1998. |
European Office Action issued in EP 00926837.6, mailed Nov. 28, 2006. |
European Office Action issued in EP 00926837.6, mailed Oct. 11, 2007. |
Australian Office Action issued in AU 2001282106, mailed Jul. 18, 2006. |
Australian Office Action issued in AU 2001233736, mailed Aug. 26, 2005. |
Australian Office Action issued in AU 2001233736, mailed Aug. 23, 2006. |
European Office Action issued in EP 97931718.7, mailed Jul. 9, 2001. |
European 2002 Office Action issued in EP 97931718.7, mailed May 8, 2002. |
International Search Report issued in International Application No. PCT/EP98/00932, mailed Jul. 28, 1998. |
Richard G. Casey et al., “A Survey of Methods and Strategies in Character Segmentation”, IEEE Transactions on Pattern Analysis and machine Intelligence, vol. 18, No. 7, pp. 690-706 (Jul. 1996). |
U.S. Appl. No. 11/240,525. |
U.S. Appl. No. 09/561,196. |
U.S. Appl. No. 11/240,632. |
U.S. Appl. No. 10/362,027. |
U.S. Appl. No. 12/191,774. |
U.S. Appl. No. 12/106,450. |
U.S. Appl. No. 12/208,088. |
U.S. Appl. No. 12/610,915. |
U.S. Appl. No. 12/570,412. |
English language translation of Office Action issued in Japanese Application No. 2012-537458 mailed Jun. 17, 2014. |
Andreas R. Dengel et al., “smartFIX: A Requirement Driven System for Document Analysis and Understanding”, In: D. Lopresti et al., Document Analysis V, Springer, pp. 433-444 (Copyright 2002). |
Bertin Klein et al., “On Benchmarking of Invoice Analysis Systems”, DAS 2006, LNCS 3872, pp. 312-323 (2006), Feb. 2006. |
U.S. Appl. No. 13/192,703. |
U.S. Appl. No. 13/548,042. |
U.S. Appl. No. 10/204,756. |
U.S. Appl. No. 12/610,915. |
U.S. Appl. No. 12/610,937. |
U.S. Appl. No. 13/477,645. |
U.S. Appl. No. 13/624,443. |
Number | Date | Country | |
---|---|---|
20110103688 A1 | May 2011 | US |