The embodiments described herein relate to processing images captured using a mobile device, and more particularly to developing standards for processing the images and assessing an image processing system to determine whether it meets the standards.
The use of mobile devices to capture images is ubiquitous in modern culture. Aside from taking pictures of people, places and events, users are utilizing the cameras on their devices for many different purposes. One of those purposes is capturing images of content that the user later wants to review, such as a description of a product, a document, license plate, etc.
Financial institutions and other businesses have become increasingly interested in electronic processing of checks and other financial documents in order to expedite processing of these documents. Some techniques allow users to scan a copy of the document using a scanner or copier to create an electronic copy of the document that can be digitally processed to extract content. This provides added convenience over the use of a hardcopy original, which would otherwise need to be physically sent or presented to the recipient for processing. For example, some banks can process digital images of checks and extract check information from the image needed to process the check for payment and deposit without requiring that the physical check be routed throughout the bank for processing.
Mobile devices that incorporate cameras are ubiquitous and may also be useful to capture images of financial documents for mobile processing of financial information. The mobile device may be connected with a financial institution or business through a mobile network connection. However, the process of capturing and uploading images of financial documents using the mobile device is often prone to error, producing images of poor quality which cannot be used to extract data. The user is often unaware of whether the captured document image is of sufficient quality to be processed by a business or financial institution.
Attempts have been made to improve the quality of mobile images of financial documents to improve the accuracy of information extracted therefrom. There are numerous ways in which an image can be improved for extracting its content, some of which are implemented individually and some of which are implemented together. However, it is difficult to determine which methods are the best at improving image quality and content extraction. Even among the methods often used, it is more difficult still to select a threshold for a particular method which will provide an accurate capture in as little time as possible. Finally, determining whether an image processing system is capable of performing adequate image capture, processing and content extraction has not been explored.
Therefore, there is a need for identifying image processing techniques which will provide optimal image correction and accurate content extraction.
Embodiments described herein provide methods for defining and determining a formal and verifiable mobile document image quality and usability (MDIQU) standard, or Standard for short. The goal of this Standard is to ensure that a mobile image can be used in an appropriate mobile document processing application, for example an application for mobile check deposit, mobile bill pay, mobile balance transfer or mobile insurance submission and application. In order to quantify the usability, the Standard establishes 5 quality and usability grades: higher-grade images will tend to produce higher accuracy results in the related “mobile” application. A mobile image is first tested to determine if its quality is sufficient to obtain content from the image by performing multiple different image quality assessment tests. If the image quality is sufficient, one or more document usability computations are made to determine if the document or content in the image is usable by a particular application. A ranking is then assigned to the image based on the results of the tests, which is indicative of the quality and usability of the image.
Systems and methods are provided for developing standards of image processing for mobile image capture and assessing whether mobile processing engines meet the standards. A mobile processing engine (MDE) is evaluated to determine if it can perform one or more technical capabilities for improving the quality of and extracting content from an image of a financial document (such as a check). A verification process then begins, where the MDE performs the image quality enhancement and text extraction steps on sets of images from a test deck of good and bad images of financial documents with known content. The results of the processing of the test deck are then evaluated by comparing confidence levels with thresholds to determine if each set of images should be accepted or rejected. Further analysis determines whether any of the sets of images were falsely accepted or rejected, and an overall error rate is computed. The overall error rate is then compared with minimum accuracy criteria, and if the criteria are met, the MDE meets the standard for mobile deposit.
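By way of illustration only, the verification step described above may be sketched as follows; the function name, the 0-to-1 confidence scale, and the thresholds are hypothetical assumptions, not part of the described system:

```python
def evaluate_engine(results, accept_threshold, max_error_rate):
    """Evaluate an MDE on a test deck of image sets with known content.

    `results` pairs each image set's extraction confidence (assumed 0-1)
    with a label: True if the set should be accepted (good images),
    False if it should be rejected (bad images).
    """
    # Sets accepted despite being bad, and sets rejected despite being good.
    false_accepts = sum(1 for conf, good in results
                        if conf >= accept_threshold and not good)
    false_rejects = sum(1 for conf, good in results
                        if conf < accept_threshold and good)
    # Overall error rate, compared against the minimum accuracy criteria.
    error_rate = (false_accepts + false_rejects) / len(results)
    return error_rate <= max_error_rate, error_rate
```

For example, a deck of four sets in which one good set falls below the acceptance threshold yields an error rate of 0.25.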
The goal of this Standard is to ensure that a mobile image can be used in an appropriate mobile document processing application, primarily Mobile Deposit [1], Mobile Bill Pay [2], Mobile Balance Transfer [3] and Mobile Insurance [4].
Other features and advantages should become apparent from the following description of the preferred embodiments, taken in conjunction with the accompanying drawings.
Various embodiments disclosed herein are described in detail with reference to the following figures. The drawings are provided for purposes of illustration only and merely depict typical or exemplary embodiments. These drawings are provided to facilitate the reader's understanding and shall not be considered limiting of the breadth, scope, or applicability of the embodiments. It should be noted that for clarity and ease of illustration these drawings are not necessarily made to scale.
The various embodiments mentioned above are described in further detail with reference to the aforementioned figures and the following detailed description of exemplary embodiments.
I. Summary of Mobile Deposit Standards
Embodiments described herein define a formal and verifiable mobile document image quality and usability (MDIQU) standard, or Standard for short. The goal of this Standard is to ensure that a mobile image can be used in an appropriate mobile document processing application, primarily mobile check deposit, mobile bill pay, mobile balance transfer or mobile insurance submission and application, although this list is only illustrative of the types of images and documents which may take advantage of the methods.
In order to quantify the usability, the Standard establishes 5 quality and usability grades: higher-grade images will tend to produce higher accuracy results in the related “mobile” application. The standards for various mobile document processing applications may use the ranking system established by this invention in order to define the various accuracy levels. For example, a mobile deposit standard may require MICR accuracy of 99% on mobile images of checks which are Grade 1 (the highest) according to this Standard, 97% on Grade 2 mobile images, etc. The lowest grade (or 2 lowest grades, depending on application preferences) defines unusable images for which no meaningful prediction of accuracy can be made.
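The grade-dependent accuracy requirements in the example above may be expressed, for instance, as a simple lookup. In the sketch below, only the Grade 1 (99%) and Grade 2 (97%) figures come from the example; the Grade 3 and Grade 4 values are hypothetical placeholders:

```python
# Per-grade MICR accuracy requirements for a mobile deposit standard.
# Grades 1 and 2 follow the example in the text; Grades 3 and 4 are
# hypothetical, and the lowest grade marks an unusable image for which
# no accuracy prediction is made.
MICR_ACCURACY_REQUIREMENTS = {1: 0.99, 2: 0.97, 3: 0.95, 4: 0.90}

def meets_standard(grade, measured_accuracy):
    """Check a measured MICR accuracy against the requirement for a grade."""
    if grade not in MICR_ACCURACY_REQUIREMENTS:
        return False  # unusable grade: no meaningful accuracy prediction
    return measured_accuracy >= MICR_ACCURACY_REQUIREMENTS[grade]
```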
Given an application (e.g. Mobile Deposit) and a mobile image, MDIQU answers two basic questions: (1) is the image quality sufficient (Image Quality), and (2) is the document within the image valid for the application (Document Validity)?
Moreover, if the image quality is characterized as bad, MDIQU will detect a particular quality defect.
The answer to the first question (Image Quality) doesn't depend on the type of document or application in question. All that MDIQU is supposed to verify is that a human could easily read the relevant data off the mobile image. For example, a good quality mobile image of a check must contain the entire check (all 4 corners) and be high-contrast and sharp, with all characters legible. The same applies to mobile images of bills, driver's licenses, etc.
The answer to the second question (Document Validity) is supposed to ensure that the document within the image is indeed a check, a bill, a driver's license, etc. Therefore, the validity factor is defined for each application independently.
II. MDIQU System and Workflow
100—accept mobile image under examination and an expected document type
200—compute individual IQA scores
300—output individual IQA scores
400—compute compound IQA scores (see Section 4)
450—output compound IQA scores
600—make decision if image quality is sufficient to proceed
700—if “No,” assign the image the lowest rank and exit workflow (in this case, the document can't even be extracted from the image).
750—if “Yes,” receive document category and optionally set of critical fields
800—compute usability score from document category and critical fields
850—output document usability score
900—compare usability score to pre-defined threshold
1000—if score is low, assign Rank 3 or Rank 4, then exit.
1100—if score is high, assign Rank 1 or Rank 2, then exit.
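The numbered workflow above may be sketched as follows. The scoring engines are passed in as stand-ins, and the thresholds, score scales, and the exact split between Ranks 1/2 and Ranks 3/4 are illustrative assumptions:

```python
def mdiqu_workflow(image, expected_type, compute_iqa, compute_usability,
                   iqa_threshold=300, usability_threshold=700):
    """Sketch of MDIQU steps 100-1100 (Rank 1 best, Rank 5 worst)."""
    # Steps 100-450: compute the compound IQA defect score
    # (assumed 0 = no defect, 1000 = severe defect).
    defect_score = compute_iqa(image)
    # Steps 600-700: insufficient quality -> lowest rank, exit.
    if defect_score > iqa_threshold:
        return 5  # the document can't even be extracted from the image
    # Steps 750-850: document usability score (assumed higher = more usable).
    usability = compute_usability(image, expected_type)
    # Steps 900-1000: low usability -> Rank 3 or Rank 4 (split is illustrative).
    if usability < usability_threshold:
        return 3 if usability >= usability_threshold // 2 else 4
    # Step 1100: high usability -> Rank 1 or Rank 2 (split is illustrative).
    return 1 if usability >= (usability_threshold + 1000) // 2 else 2
```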
One embodiment of an MDIQU system is illustrated in
III. Individual Mobile Image Quality Tests
In one embodiment, the MDIQU system performs the following image quality tests (Mobile IQA tests) to detect the following image deficiencies: Out-of-focus (OOF), Shadow, Small Size, View Skew (perspective distortion), Plain Skew, Warped, Low Internal Contrast, Reflection, Cut Corners, Cut Sides, Busy Background and Low External Contrast. Each score should be returned on a scale of 0 to 1000, where 0 means “No Defect” and 1000 means “Severe Defect.” Alternatively, the MDIQU system may detect the absence of the defects, thus swapping 0 and 1000.
U.S. Pat. No. 8,582,862, issued Nov. 12, 2013, gives a description of the Mobile IQA tests and corresponding algorithms, and is herein incorporated by reference in its entirety. Below is a brief description of IQA tests along with their relative importance.
Examples of Individual Mobile Defects
Reflection IQA testing doesn't work well on the image level and needs to be “localized” to the field level.
These IQA scores may be averaged or combined to determine a total score, with each score optionally weighted depending on the relevance of its particular IQA test. In any case, a simple threshold value may be set to determine whether the image has sufficient quality to proceed to the document usability tests.
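A minimal sketch of such a weighted combination and threshold decision follows; the test names, weights, and threshold are illustrative assumptions, not values from the described system:

```python
# Individual IQA defect scores on the 0-1000 scale described above
# (0 = no defect, 1000 = severe defect); weights are illustrative only.
IQA_WEIGHTS = {
    "out_of_focus": 3.0, "shadow": 2.0, "view_skew": 2.0,
    "cut_corners": 1.5, "busy_background": 1.0,
}

def weighted_iqa(scores, weights=IQA_WEIGHTS):
    """Weighted average of individual IQA defect scores (lower is better)."""
    total_weight = sum(weights[name] for name in scores)
    return sum(scores[name] * weights[name] for name in scores) / total_weight

def quality_sufficient(scores, threshold=300):
    """Proceed to the usability tests only if the combined defect score
    falls below the (illustrative) threshold."""
    return weighted_iqa(scores) < threshold
```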
IV. Compound IQA Scores
Embodiments of the invention use the individual mobile IQA scores to automatically create two compound IQA scores, which ensure both that the mobile document can be detected in the mobile image and that the document is fully legible. These compound IQAs are titled “Crop IQA” and “Quality IQA.”
Benefits of Compound IQAs Over “Individual” IQAs
Simplicity: There are more than a dozen individual IQAs, and in essence they are replaced by two compound ones: “crop” and “quality.” It's much better to move two dials than a dozen.
Eliminating wrong messages: In reality, some individual IQAs are strongly correlated with one another and with other defects. For example, shadow and darkness (two different IQAs) may degrade the OOF score (another IQA), as can a small image (yet another IQA); a wrong crop (which has its own IQA) can cause wrong identification and therefore cause the image to be presented as “unsupported,” and so on. By compounding related IQAs, we avoid potential misclassifications and get rid of this hierarchy and thresholds altogether. MIP will have a much smaller set of thresholds (3), and the error classifier will be well-tested and optimized in R&D.
Description of Compound IQA Scores
The compound IQA scores are linear combinations of the individual IQA scores. The coefficients in each combination depend on the application, as different individual IQA scores have different importance for different document categories; see Table 2.
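By way of illustration, such a linear combination may look as follows; the coefficient values and field names are hypothetical stand-ins for the entries of Table 2:

```python
# Category-dependent coefficients for the two compound scores ("Crop IQA"
# and "Quality IQA"); the numbers are illustrative stand-ins for Table 2.
COEFFICIENTS = {
    "us_check": {
        "crop":    {"cut_corners": 0.5, "cut_sides": 0.3, "view_skew": 0.2},
        "quality": {"out_of_focus": 0.6, "shadow": 0.25,
                    "low_internal_contrast": 0.15},
    },
}

def compound_iqa(individual_scores, category, kind):
    """Linear combination of individual IQA scores for one compound score."""
    coeffs = COEFFICIENTS[category][kind]
    return sum(coeffs[name] * individual_scores.get(name, 0)
               for name in coeffs)
```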
V. Document Usability
Embodiments of the MDIQU system provide document usability testing to ensure that the document within the image is indeed a check, a bill, a driver's license etc.
The following categories may be supported: US Checks (Mobile Deposit), Remittance Coupon (Mobile Bill Pay), Credit Card Bills (Mobile Balance Transfer), and Driver's License (Mobile Insurance).
Definition of Critical Fields
The usability testing is based on a definition of Critical (or Required) Fields, see
Computation of Usability Score
The usability computation involves data capture from the document image cropped out of the mobile image. The present MDIQU system uses a Mobile Preprocessing Engine and a Dynamic Data Capture engine described in U.S. Pat. No. 8,379,914, which is incorporated herein by reference in its entirety.
The document usability score is computed based on the set of captured critical fields and/or the confidence of each such field. As with the image quality test, the document usability test results in a yes-or-no answer to the question of whether the document is usable; the system then further analyzes the result to assign a grade value to the document based on how usable (or unusable) it is.
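A minimal sketch of such a computation follows; the field names, the 0-1000 confidence scale, the averaging scheme, and the threshold are illustrative assumptions rather than the patented computation:

```python
def usability_score(captured_fields, critical_fields):
    """Usability score from captured critical fields and their confidences.

    `captured_fields` maps field name -> capture confidence (assumed
    0-1000); a critical field that was not captured contributes 0.
    """
    if not critical_fields:
        return 0
    total = sum(captured_fields.get(name, 0) for name in critical_fields)
    return total // len(critical_fields)

def is_usable(captured_fields, critical_fields, threshold=700):
    """Yes-or-no usability decision against an illustrative threshold."""
    return usability_score(captured_fields, critical_fields) >= threshold
```

For instance, a check image missing one of three hypothetical critical fields would see its score pulled down by the uncaptured field's zero contribution.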
Computer-Implemented Embodiment
While various embodiments have been described above, it should be understood that they have been presented by way of example only, and not of limitation. The breadth and scope should not be limited by any of the above-described exemplary embodiments. Where this document refers to technologies that would be apparent or known to one of ordinary skill in the art, such technologies encompass those apparent or known to the skilled artisan now or at any time in the future. In addition, the described embodiments are not restricted to the illustrated example architectures or configurations, but the desired features can be implemented using a variety of alternative architectures and configurations. As will become apparent to one of ordinary skill in the art after reading this document, the illustrated embodiments and their various alternatives can be implemented without confinement to the illustrated example. One of ordinary skill in the art would also understand how alternative functional, logical or physical partitioning and configurations could be utilized to implement the desired features of the described embodiments.
Furthermore, although items, elements or components may be described or claimed in the singular, the plural is contemplated to be within the scope thereof unless limitation to the singular is explicitly stated. The presence of broadening words and phrases such as “one or more,” “at least,” “but not limited to” or other like phrases in some instances shall not be read to mean that the narrower case is intended or required in instances where such broadening phrases may be absent.
The present application is a continuation of U.S. patent application Ser. No. 15/397,639, filed on Jan. 3, 2017, which is a continuation of U.S. patent application Ser. No. 14/217,170, filed on Mar. 17, 2014, now issued as U.S. Pat. No. 9,536,139, which claims priority to provisional application No. 61/802,039, filed on Mar. 15, 2013, all which are hereby incorporated by reference in their entirety.
Number | Name | Date | Kind |
---|---|---|---|
5966473 | Takahashi et al. | Oct 1999 | A |
6038351 | Rigakos | Mar 2000 | A |
7426316 | Vehvilainen | Sep 2008 | B2 |
7584128 | Mason et al. | Sep 2009 | B2 |
7735721 | Ma et al. | Jun 2010 | B1 |
7869098 | Corso et al. | Jan 2011 | B2 |
7873200 | Oakes, III et al. | Jan 2011 | B1 |
7876949 | Oakes, III et al. | Jan 2011 | B1 |
7978900 | Nepomniachtchi et al. | Jul 2011 | B2 |
8358826 | Medina, III et al. | Jan 2013 | B1 |
8538124 | Harpel et al. | Sep 2013 | B1 |
8959033 | Oakes, III et al. | Feb 2015 | B1 |
9058512 | Medina, III | Jun 2015 | B1 |
9613258 | Chen | Apr 2017 | B2 |
9842331 | Nepomniachtchi | Dec 2017 | B2 |
20040017947 | Yang | Jan 2004 | A1 |
20040037448 | Brundage | Feb 2004 | A1 |
20040109597 | Lugg | Jun 2004 | A1 |
20050097046 | Singfield | May 2005 | A1 |
20050180661 | El Bernoussi et al. | Aug 2005 | A1 |
20050220324 | Klein | Oct 2005 | A1 |
20060045322 | Clarke et al. | Mar 2006 | A1 |
20060045344 | Paxton et al. | Mar 2006 | A1 |
20060177118 | Ibikunle et al. | Aug 2006 | A1 |
20070009155 | Potts et al. | Jan 2007 | A1 |
20070086643 | Spier et al. | Apr 2007 | A1 |
20070154071 | Lin et al. | Jul 2007 | A1 |
20070168382 | Tillberg et al. | Jul 2007 | A1 |
20070206877 | Wu et al. | Sep 2007 | A1 |
20070211964 | Agam et al. | Sep 2007 | A1 |
20070288382 | Narayanan et al. | Dec 2007 | A1 |
20070297664 | Blaikie | Dec 2007 | A1 |
20080174815 | Komaki | Jul 2008 | A1 |
20080183576 | Kim et al. | Jul 2008 | A1 |
20090063431 | Erol | Mar 2009 | A1 |
20090185736 | Nepomniachtchi | Jul 2009 | A1 |
20090185737 | Nepomniachtchi | Jul 2009 | A1 |
20090198493 | Hakkani-Tur et al. | Aug 2009 | A1 |
20100102119 | Gustin et al. | Apr 2010 | A1 |
20100114765 | Gustin et al. | May 2010 | A1 |
20100114766 | Gustin et al. | May 2010 | A1 |
20100114771 | Gustin et al. | May 2010 | A1 |
20100114772 | Gustin et al. | May 2010 | A1 |
20100200660 | Moed et al. | Aug 2010 | A1 |
20100208282 | Isaev | Aug 2010 | A1 |
20110013822 | Meek et al. | Jan 2011 | A1 |
20110052065 | Nepomniachtchi et al. | Mar 2011 | A1 |
20110091092 | Nepomniachtchi et al. | Apr 2011 | A1 |
20110280450 | Nepomniachtchi | Nov 2011 | A1 |
20120010885 | Hakkani-Tur et al. | Jan 2012 | A1 |
20120030104 | Huff et al. | Feb 2012 | A1 |
20120033892 | Blenkhorn et al. | Feb 2012 | A1 |
20120070062 | Houle et al. | Mar 2012 | A1 |
20120150773 | DiCorpo et al. | Jun 2012 | A1 |
20120197640 | Hakkani-Tur et al. | Aug 2012 | A1 |
20120230577 | Calman et al. | Sep 2012 | A1 |
20120278336 | Malik et al. | Nov 2012 | A1 |
20130058531 | Hedley et al. | Mar 2013 | A1 |
20130120595 | Roach | May 2013 | A1 |
20140040141 | Gauvin et al. | Feb 2014 | A1 |
20140126790 | Duchesne et al. | May 2014 | A1 |
20140133767 | Lund et al. | May 2014 | A1 |
Number | Date | Country |
---|---|---|
10-2007-0115834 | Dec 2007 | KR |
Entry |
---|
International Search Report issued in related International Application No. PCT/US2011/056593 dated May 30, 2012 (3 pages). |
Office Action for related CA Patent Application No. 2,773,730, dated Aug. 21, 2017, in 4 pages. |
Number | Date | Country | |
---|---|---|---|
20180025223 A1 | Jan 2018 | US |
Number | Date | Country | |
---|---|---|---|
61802039 | Mar 2013 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 15397639 | Jan 2017 | US |
Child | 15721160 | US | |
Parent | 14217170 | Mar 2014 | US |
Child | 15397639 | US |