Systems and methods for mobile image capture and processing of documents

Information

  • Patent Grant
  • Patent Number
    12,014,350
  • Date Filed
    Thursday, November 19, 2020
  • Date Issued
    Tuesday, June 18, 2024
Abstract
Techniques for processing images of documents captured using a mobile device are provided. The images can include different sides of a document from a mobile device for an authenticated transaction. In an example implementation, a method includes inspecting the images to detect a feature associated with a first side of the document. In response to determining that an image is the first side of the document, a type of content to be analyzed on the image of the first side is selected, and one or more regions of interest (ROIs) that are known to include the selected type of content are identified on the image of the first side. A process can include receiving a sub-image of the image of the first side from the preprocessing unit and performing a content detection test on the sub-image.
Description
BACKGROUND
1. Technical Field

The embodiments described herein relate to processing images of documents captured using a mobile device, and more particularly to real-time detection of the content of a financial document for use in mobile banking applications.


2. Related Art

Banks and other businesses have become increasingly interested in electronic processing of checks and other financial documents in order to expedite processing of these documents. Users can scan a copy of the document using a scanner or copier to create an electronic copy of the document that can be processed instead of the hardcopy original, which would otherwise need to be physically sent to the recipient for processing. For example, some banks can process digital images of checks and extract the check information needed to process the check from the image, without requiring that the physical check be routed throughout the bank for processing. However, the type and accuracy of information that can be extracted from an image of a check are limited. As a result, some checks cannot be processed and are rejected during the mobile deposit process.


Mobile phones that incorporate cameras have become ubiquitous and can be used to capture images of financial documents for mobile processing of financial information through a mobile network connection with a banking institution or business. However, the process of uploading images of financial documents is prone to user error, as the user is often unaware of whether the document is complete and ready for processing by a business or financial institution. A user with a mobile phone or portable electronic device cannot interact with an employee of the bank to determine whether the financial document is ready to be processed.


Therefore, there is a need for the ability to obtain additional information from a digital image of a check that has been captured by a mobile device in order to streamline mobile banking applications and reduce the number of transaction errors when a document is incomplete.


SUMMARY

Systems and methods for processing an image of a check captured using a mobile device, such as a mobile phone, are provided. These techniques can be implemented on a mobile device and at a central server, and can be used to identify content on a check and determine whether the check is ready to be processed by a business or financial institution. The system can identify portions of the check—such as the endorsement area—to determine if the check has been properly endorsed. If the check lacks an endorsement, a real-time notification can be provided to the user who uploaded the check image so the user can correctly endorse the check and upload a new check image. Additional portions of the check, including the signature line, the addressee field, etc., can be checked to ensure that the check is ready to be deposited by the bank.


According to one embodiment, a mobile check image processing system for identifying content in regions of an image of a check captured by a mobile device is provided. The system may comprise a preprocessing unit which receives an image of a check from a mobile device, selects a type of content to be detected on the check image, and identifies at least one region of interest (ROI); a testing unit which performs at least one content detection test on the at least one ROI to detect content of the selected type on the check image; and a feedback unit which receives a result of the at least one content detection test and provides the result to a user. The ROI may correspond to an endorsement region on a back side of the check.


According to another embodiment, a method of identifying content in regions of an image of a check captured by a mobile device is provided. The method may comprise receiving an image of a check from a mobile device; selecting a type of content to be detected on the check image; identifying at least one region of interest (ROI) on the check image; performing at least one content detection test on the at least one ROI to detect content of the selected type on the check image; and providing a result of the at least one content detection test to a user. The ROI may correspond to an endorsement region on a back side of the check.


According to yet another embodiment, a computer-implemented method for processing a mobile-captured image of a check embodied on a computer with a processor and a memory may comprise receiving an image of a check from a mobile device; selecting a type of content to be detected on the check image; identifying at least one region of interest (ROI) on the check image; performing at least one content detection test on the at least one ROI to detect content of the selected type on the check image; and providing a result of the at least one content detection test to a user. The ROI may correspond to an endorsement region on a back side of the check.
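

The following is a minimal Python sketch of the summarized flow: receive a check image, select a type of content, identify an ROI, run a content detection test, and return a result. The helper functions, the ROI geometry, and the darkness threshold are illustrative assumptions and are not the disclosed implementation.

```python
# A minimal sketch of the summarized method, assuming placeholder detection logic.

def process_check_image(check_image, content_type="endorsement"):
    """Select a content type, find its ROI, run a content detection test, report."""
    roi = identify_roi(check_image, content_type)        # e.g., endorsement region on the back
    detected = run_content_detection(roi)                # at least one content detection test
    return {"content_type": content_type, "content_present": detected}

def identify_roi(check_image, content_type):
    # Hypothetical assumption: the endorsement region occupies the left third
    # of the back side of the check.
    width = len(check_image[0])
    if content_type == "endorsement":
        return [row[: width // 3] for row in check_image]
    return check_image

def run_content_detection(roi):
    # Hypothetical test: treat any sufficiently dark pixels as written content.
    dark = sum(1 for row in roi for px in row if px < 128)
    total = sum(len(row) for row in roi)
    return dark / max(total, 1) > 0.01

if __name__ == "__main__":
    # A toy 4x6 grayscale "image"; real input would come from the mobile camera.
    fake_back_side = [[255, 30, 255, 255, 255, 255] for _ in range(4)]
    print(process_check_image(fake_back_side))
```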


These and other features, aspects, and embodiments are described below in the section entitled “Detailed Description.”





BRIEF DESCRIPTION OF THE DRAWINGS

Features, aspects, and embodiments are described in conjunction with the attached drawings, in which:



FIG. 1 is a high level block diagram of a system for identifying content in regions of an image of a check captured by a mobile device, according to one exemplary embodiment;



FIG. 2 is a flow diagram of a method for identifying content in regions of an image of a check captured by a mobile device, according to one exemplary embodiment;



FIG. 3 is an image of a back side of a check depicting an endorsement region where an endorsement should be present, according to one exemplary embodiment;



FIG. 4 is an image of a front side of a check depicting a payee field which can be compared with an endorsement to determine the authenticity of the endorsement, according to one exemplary embodiment;



FIG. 5 is a method for testing the quality of a MICR (Magnetic Ink Character Recognition) line of the check image, according to an embodiment;



FIG. 6 is a method for testing the aspect ratios of front and back images of a check to test whether the images are of the same check, according to an embodiment; and



FIG. 7 is a block diagram of a computing device on which the mobile image quality assurance system described above can be implemented, according to an embodiment; and



FIG. 8 is a simplified block diagram illustrating an example computing module in accordance with one embodiment.





DETAILED DESCRIPTION

The following detailed description is directed to certain specific embodiments. However, it will be understood that these embodiments are by way of example only and should not be seen as limiting the systems and methods described herein to the specific embodiments, architectures, etc. In this description, reference is made to the drawings wherein like parts are designated with like numerals throughout.


Systems and methods for processing an image of a check captured using a mobile device, such as a mobile phone, are provided. These techniques can be implemented on a mobile device and at a central server, and can be used to identify content on a check and determine whether the check is ready to be processed by a business or financial institution. The system can identify portions of the check—such as the endorsement area—to determine if the check has been properly endorsed. If the check lacks an endorsement, a real-time notification can be provided to the user who uploaded the check image so the user can correctly endorse the check and upload a new check image. Additional portions of the check, including the signature line, the addressee field, etc., can be checked to ensure that the check is ready to be deposited by the bank.


Differentiating between characteristics of checks provides additional information to a bank being asked to deposit the check as to the potential risk of the check being fraudulent. The risk of fraudulent checks varies depending on the characteristics of the check, and so a bank can set up various customized rules. If the check has a characteristic that is commonly associated with fraud, the bank may immediately deny the request, or request additional processing of the check image before deciding whether to deposit the check. The user is sent a message if the deposit is denied, and may be provided with instructions to manually deposit the check so that the bank can review the original check.


Content Identification


FIG. 1 is a high level block diagram of a system for identifying content in regions of an image of a check captured by a mobile device, according to one exemplary embodiment. A Mobile Check Image Processing System (MCIPS) 100 is shown as a single unit, although one or more of the components of the MCIPS 100 may reside on a mobile device 102 or on an MCIPS server 104 connected with the mobile device through a network (not shown). In the embodiment illustrated in FIG. 1, a user of the mobile device 102 captures an image of a check (“check image”) 106 using an image capture device incorporated within the mobile device 102 or connected with the mobile device 102. The check image 106 may include more than one image file, as the user may need to submit images of both the front side and the back side of the check in order for the check to be deposited with a bank. Also, in some cases more than one image may need to be taken of one side of the check in order to clearly capture the content of the check, so the check image 106 refers to the one or more images of the check which are submitted to the MCIPS 100. The check image 106 is transmitted from the mobile device 102 to a preprocessing unit 108 at the MCIPS server 104, where the check image 106 is processed to select a type of content to be identified and to identify at least one region of interest (ROI) where that type of content would be present. The ROI will correspond to a particular area of the check that is known to contain specific content, such as an endorsement, a payor name, a payor address, a bank account number, a routing number, etc. Programs which execute particular algorithms to identify a specific ROI are stored in an ROI programs database 110 connected with the preprocessing unit 108. The preprocessing unit 108 selects one or more programs to run which identify one or more ROIs on the check image 106 that should contain the specific content needed to perform a mobile deposit process. In the embodiments described herein, the ROI may be an endorsement region on a back side of the check where an endorsement is found.
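

The sketch below shows one way an ROI programs database (110 in FIG. 1) might map a selected content type to a region of the check image. The fractional coordinates and content-type names are illustrative assumptions for this sketch, not values disclosed in the patent.

```python
# A minimal sketch of ROI definitions keyed by content type.

ROI_DEFINITIONS = {
    # (side, left, top, right, bottom) as fractions of image width/height
    "endorsement": ("back",  0.00, 0.00, 0.25, 1.00),
    "payee":       ("front", 0.10, 0.35, 0.70, 0.50),
    "signature":   ("front", 0.55, 0.75, 0.95, 0.90),
    "micr_line":   ("front", 0.00, 0.88, 1.00, 1.00),
}

def roi_pixels(content_type, image_width, image_height):
    """Convert a fractional ROI definition into pixel coordinates."""
    side, l, t, r, b = ROI_DEFINITIONS[content_type]
    box = (int(l * image_width), int(t * image_height),
           int(r * image_width), int(b * image_height))
    return side, box

if __name__ == "__main__":
    print(roi_pixels("endorsement", 1600, 700))  # -> ('back', (0, 0, 400, 700))
```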


Once the ROIs are identified, the check image 106 is sent to a testing unit 112 to perform one or more content detection tests on the identified ROIs. In one embodiment, the preprocessing unit may streamline the process by sending a sub-image of the check image 106 that contains only the identified ROIs that need to be tested. Sending the sub-image would be particularly beneficial if the preprocessing unit is located on the mobile device 102 and the testing unit is located on the MCIPS server 104, as transmitting only the ROIs across a network would take less time. The testing unit 112 obtains content detection tests from a test and rules database 114 connected with the testing unit 112. The content detection tests are programs that identify specific content in the ROIs, such as an endorsement signature in an endorsement ROI. The content detection tests may identify the presence or absence of particular content or distinguish between types of content in the ROI. For example, an endorsement content detection test may first identify whether an endorsement is present in the endorsement ROI, but will also determine the type of endorsement—such as a hand-written signature or a stamp.
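

The following is a minimal sketch of extracting an ROI sub-image on the mobile device so that only the relevant pixels are transmitted to the testing unit on the server. It uses Pillow for cropping; the endorsement box coordinates and JPEG quality are assumptions made for illustration.

```python
# A minimal sketch of cropping an ROI sub-image before sending it over the network.

from io import BytesIO
from PIL import Image

def extract_roi_subimage(check_back: Image.Image, box=(0, 0, 400, 700)) -> bytes:
    """Crop the endorsement ROI and return it as a compressed JPEG payload."""
    sub = check_back.crop(box)            # keep only the ROI pixels
    buf = BytesIO()
    sub.save(buf, format="JPEG", quality=85)
    return buf.getvalue()                 # smaller payload for the network hop

if __name__ == "__main__":
    demo = Image.new("L", (1600, 700), color=255)   # stand-in for a captured back side
    payload = extract_roi_subimage(demo)
    print(f"sub-image payload: {len(payload)} bytes")
```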


In one embodiment, the content on the check can be manually entered by the user. The check content can be optionally provided by the user at the time that the check is captured. This check content can include various information from the check, such as the check amount, check number, routing information from the face of the check, or other information, or a combination thereof. In some embodiments, a mobile deposit application requests this information from a user of the mobile device, allows the user to capture an image of a check or to select an image of a check that has already been captured, or both, and the mobile deposit application provides the check image, the check content, and other processing parameters to the MCIPS 100.
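

The snippet below sketches the kind of payload a mobile deposit application might assemble when the user keys in check content alongside the captured images. The field names are illustrative assumptions, not a documented MCIPS interface.

```python
# A minimal sketch of a deposit request combining user-entered check content,
# the captured images, and processing parameters.

import json

deposit_request = {
    "check_amount": "125.50",                    # user-entered content
    "check_number": "1047",
    "routing_number": "021000021",
    "images": ["front.jpg", "back.jpg"],         # newly captured or previously captured
    "processing_parameters": {"run_endorsement_test": True},
}

print(json.dumps(deposit_request, indent=2))
```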


Once the testing unit performs the appropriate content detection tests, the results of the tests are forwarded to a feedback unit 116 to analyze the results and provide appropriate feedback to the user or to a bank 118. If the endorsement content detection test determines that there is no endorsement on the check, the feedback unit 116 will generate a message to send back to the mobile device 102 telling the user that the endorsement is missing and perhaps requesting that the user endorse the check and upload a new image of the endorsed check. Or, if the endorsement is complete, the feedback unit 116 will generate and send a message to the mobile device 102 indicating that the check is properly endorsed and will be deposited in the user's account. The feedback unit 116 may also generate messages to send to the bank 118 which is receiving the deposited check. For example, the messages may indicate whether the check is ready to be deposited, if a problem has been detected which requires the bank to perform additional processing of the check image, or if the user needs to physically bring the check in.


The feedback unit 116 may generate messages by accessing a feedback message database 120 which stores the messages. The feedback message database 120 may also store rules for generating messages based on the results of the tests performed. In one embodiment, the bank 118 may configure the rules stored in the feedback message database 120 so that certain messages are sent to the bank or the user depending on the results of the tests performed.
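

The following is a minimal sketch of configurable feedback rules, assuming a simple in-memory stand-in for the feedback message database (120). The rule keys, recipients, and message text are illustrative assumptions; a bank could override them as described above.

```python
# A minimal sketch of mapping a test outcome to a feedback message and recipients.

FEEDBACK_RULES = {
    ("endorsement", "missing"): {
        "recipients": ["user"],
        "message": "The check has not been endorsed. Please endorse it and upload a new image.",
    },
    ("endorsement", "present"): {
        "recipients": ["user", "bank"],
        "message": "The check is properly endorsed and will be deposited.",
    },
}

def build_feedback(test_name, outcome):
    rule = FEEDBACK_RULES.get((test_name, outcome))
    if rule is None:
        # Fall back to routing unknown outcomes to the bank for review.
        return {"recipients": ["bank"], "message": "Manual review required."}
    return rule

print(build_feedback("endorsement", "missing"))
```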


The feedback unit 116 may also be configured to take a particular action that coincides with the message that is being delivered to the user or the bank. If an endorsement is not present on the check, the feedback unit 116 may also suspend the mobile deposit process and await a new check image from the user. Once the revised check image is received and the testing unit 112 returns a positive test confirming the presence of an endorsement, the feedback unit 116 reactivates the mobile deposit process and sends a message to the bank 118 that the check is ready to be deposited.


In certain embodiments, the mobile application can display information on the mobile device shortly after the user takes the mobile document image to allow the user to retake the image if the image is found to have defects that affect the overall status of the image. In some embodiments, where the MCIPS 100 is implemented at least in part on the mobile device, the MCIPS 100 can include a user interface module that is configured to display the test results message on a screen of the mobile device 102.



FIG. 1 merely provides a description of the logical components of the MCIPS 100. In some embodiments, the MCIPS 100 can be implemented on the mobile device, in software, hardware, or a combination thereof. In other embodiments, the MCIPS 100 can be implemented on a remote server (not shown), and the mobile device can send the check image 106 e.g., via a wireless interface, to the remote server for processing. The remote server sends the test results and test messages to the mobile device 102 to indicate whether the mobile image passed testing. In some embodiments, part of the functionality of the MCIPS 100 can be implemented on the mobile device 102 while other parts of the MCIPS 100 are implemented on the remote server. The MCIPS 100 can be implemented in software, hardware, or a combination thereof. In still other embodiments, the MCIPS 100 can be implemented entirely on the remote server, and can be implemented using appropriate software, hardware, or a combination thereof.



FIG. 2 is a flow diagram of a method for identifying content in at least one region of an image of a check captured by a mobile device, according to one exemplary embodiment. In step 205, the check image is received from the mobile device. In step 210, the type of content to be identified is selected—either a test to check for an endorsement or a test to determine other content. In one embodiment, both tests can be selected and run either simultaneously or in sequence.


If the endorsement test is selected (“Endorsement”), the next step is step 215 to identify a region of interest (ROI) where the endorsement would be present on the check. In many situations, the ROI is on the back of the check in an endorsement area (see FIG. 3). Next, in step 220, a content detection test is run on the selected ROI to determine if an endorsement is present. In step 225, the results of the test of whether the endorsement is present are used to determine the next step. If the endorsement is not present, then, in step 230, the user is notified that the check they are attempting to deposit has not been endorsed. This notification may also include instructions to correct the error. Once the user is notified, the mobile deposit process may be suspended (step 270) until another image is submitted. If the endorsement is present in the ROI, an endorsement comparison test may be executed at step 235, where the identified endorsement is compared with either a signature or stamp stored on the MCIPS server, or where the signature or stamp is compared with the name listed in a payee field (see FIG. 4) on a front area of the check to determine if the person or entity endorsing the check is the same person that the check is addressed to. In step 240, if the endorsement does not match the stored endorsement or the payee field, the user is notified (step 230) and the mobile deposit process may be suspended (step 270). If the endorsement does match the payee field name or the stored endorsement signature or stamp, the mobile deposit process may be completed in step 245.
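

The following is a minimal sketch of the endorsement branch of FIG. 2 (steps 215-245). The detection and comparison predicates are placeholders passed in as arguments; they stand in for, and are not, the patented content detection and comparison tests.

```python
# A minimal sketch of the endorsement decision flow in FIG. 2.

def endorsement_flow(back_roi, payee_name, detect, matches_payee):
    if not detect(back_roi):                       # steps 220/225: is an endorsement present?
        return "notify_user_missing_endorsement"   # step 230; deposit suspended (step 270)
    if not matches_payee(back_roi, payee_name):    # steps 235/240: compare with payee field
        return "notify_user_endorsement_mismatch"  # step 230; deposit suspended (step 270)
    return "complete_mobile_deposit"               # step 245

# Example usage with trivial stand-in tests:
result = endorsement_flow(
    back_roi="signature pixels",
    payee_name="Jane Doe",
    detect=lambda roi: True,
    matches_payee=lambda roi, name: True,
)
print(result)  # -> complete_mobile_deposit
```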


If the “Other Type” test is selected at step 210, a region of interest (ROI) on the check image where the other type of content would appear is identified in step 250. In step 255, a content detection test is run on the identified ROI to determine characteristics of the other type of content. Once the characteristics have been identified, they are evaluated in step 260 based on processing rules established for that type of content. For example, a processing rule could check for a proper signature by a payor.
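

The sketch below illustrates the “Other Type” branch (steps 250-260): detected characteristics are evaluated against processing rules. The rule names, thresholds, and characteristic fields are assumptions for illustration only.

```python
# A minimal sketch of evaluating detected ROI characteristics against processing rules.

def evaluate_characteristics(characteristics, rules):
    """Return the names of any processing rules that the detected characteristics fail."""
    return [name for name, rule in rules.items() if not rule(characteristics)]

processing_rules = {
    "payor_signature_present": lambda c: c.get("signature_strokes", 0) > 50,
    "amount_legible":          lambda c: c.get("amount_confidence", 0.0) >= 0.8,
}

detected = {"signature_strokes": 12, "amount_confidence": 0.95}
print(evaluate_characteristics(detected, processing_rules))  # -> ['payor_signature_present']
```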



FIG. 3 is an image of a back side 300 of a check, depicting an endorsement region 302 where an endorsement should be present, according to one exemplary embodiment.



FIG. 4 is an image of a front side 400 of a check, depicting a payee field 402 which can be compared with the identified endorsement to determine whether the endorsement and the payee field match, according to one exemplary embodiment.


According to some embodiments, the systems and methods for identifying content of a check can be integrated with one or more mobile applications that process mobile document images, such as mobile deposit applications for processing images of checks to be deposited into a bank account. According to an embodiment, the mobile application can run on the mobile device and can be integrated to use mobile image quality assurance functionality that is implemented on the mobile device and/or on a remote server. The mobile application can use the system for identifying content of a check to streamline the mobile deposit process.


MICR-Line Test

In one embodiment, an image of a check can be rejected if the MICR line on the check cannot be clearly detected in the image. MICR stands for Magnetic Ink Character Recognition. Information can be printed on the front of a check in a special typeface using a special magnetized ink. Therefore, the MICR-line test is useful to determine which side of the check is in a check image before a region of interest (ROI) is selected. A MICR IQA test can use optical character recognition techniques to identify the MICR information on a check. If the MICR line on the front of the check is damaged, simply retaking an image of the check will not correct the defects in the image and the image will be rejected; however, if the MICR line was merely blurry or unreadable due to one or more of the factors described above, retaking the image after correcting one or more of those factors may result in an image of high enough quality that the MICR line can be read from the check.



FIG. 5 illustrates a method for testing the quality of a MICR (Magnetic Ink Character Recognition) line of the check image, according to an embodiment. The MICR-line Test is used to determine whether a high quality image of a check front has been captured using the mobile device according to an embodiment. The MICR-line Test can be used in conjunction with a Mobile Deposit application to ensure that images of checks captured for processing with the Mobile Deposit application are of a high enough quality to be processed so that the check can be electronically deposited. Furthermore, if a mobile image fails the MICR-line Test, the failure may be indicative of incorrect subimage detections and/or poor overall quality of the mobile image, and such an image should be rejected anyway.



FIG. 5 is a flow chart of a method for executing a MICR-line Test according to an embodiment. A mobile image is received (step 505) and a bitonal image is generated from the mobile image (step 510). In an embodiment, preprocessing unit 108 extracts the document subimage from the check image as described above, including preprocessing such as geometric correction. The extracted subimage can then be converted to a bitonal snippet by the preprocessing unit 108. The MICR line is then identified in the bitonal snippet (step 515). According to an embodiment, a MICR recognition engine is then applied to identify the MICR line and to compute character-level and overall confidence values for the image (step 520). These confidences can then be normalized to the 0-1000 scale used by the mobile IQA tests, where 1000 means high quality and 0 means poor MICR quality. The confidence level is then returned (step 525). As described above, the test result value is provided to the testing unit 112, where the test result value can be compared to a threshold value associated with the test. If the test result value falls below the threshold associated with the test, detailed test result messages can be retrieved from the feedback message database 120 and provided to the user via the feedback unit 116 to indicate why the test failed and what might be done to remedy the failure. For example, the user may simply need to retake the image to adjust for geometrical or other factors, such as poor lighting or a shadowed document. In some instances, the user may not be able to correct the errors; for example, if the MICR line on the document is damaged or incomplete, the document will continue to fail the test even if the image is retaken.
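

The following is a minimal sketch of the confidence handling described for the MICR-line Test: per-character confidences from a recognition engine are combined, normalized to the 0-1000 IQA scale, and compared to a test threshold. The combination rule (weakest character dominates) and the threshold value are assumptions; the patent does not specify them.

```python
# A minimal sketch of normalizing MICR confidences and applying a test threshold.

def micr_line_test(char_confidences, threshold=700):
    """char_confidences: per-character confidences in [0.0, 1.0]."""
    if not char_confidences:
        overall = 0.0
    else:
        overall = min(char_confidences)          # assumption: weakest character dominates
    test_result = int(round(overall * 1000))     # 0 = poor quality, 1000 = high quality
    passed = test_result >= threshold
    return test_result, passed

print(micr_line_test([0.98, 0.91, 0.95]))  # -> (910, True)
print(micr_line_test([0.98, 0.40, 0.95]))  # -> (400, False)
```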


Front-as-Rear Test


FIG. 6 is a method for testing the aspect ratios of front and back images of a check to test whether the images are of the same check, according to an embodiment. A Front-as-Rear Test can be used to determine whether an image that is purported to be the back of a check is actually an image of the front of the check according to an embodiment. The Front-as-Rear Test is a check-specific Boolean test. The test returns a value of 0 if an image fails the test and a value of 1000 if an image passes the test. If a MICR line is identified on what is purported to be an image of the back of the check, the image will fail the test and generate a test message that indicates that the images of the check have been rejected because an image of the front of the check was mistakenly passed as an image of the rear of the check.


An image of the rear of the check is received (step 605) and the image is converted to a bitonal snippet by preprocessing unit 108 of the MCIPS 100 (step 610). A MICR recognition engine is then applied to identify a MICR line in the bitonal snippet (step 615). The results from the MICR recognition engine can then be normalized to the 0-1000 scale used by the mobile IQA tests, and the normalized value compared to a threshold value associated with the test (step 620). According to an embodiment, the test threshold can be provided as a parameter to the test along with the mobile document image to be tested. According to an embodiment, the threshold used for this test is lower than the threshold used in the MICR-line Test described above.


If the normalized test result equals or exceeds the threshold, then the image includes a MICR line and the test is marked as failed (test result value=0), because a MICR line was identified in what was purported to be an image of the back of the check. If the normalized test result is less than the threshold, the image did not include a MICR line and the test is marked as passed (test result value=1000). The test result value is then returned (step 625).
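

The sketch below captures the Front-as-Rear decision logic just described: a MICR line found on the purported rear image means the test fails (0); otherwise it passes (1000). The threshold value and example confidences are illustrative assumptions.

```python
# A minimal sketch of the Front-as-Rear Boolean test decision.

def front_as_rear_test(normalized_micr_confidence, threshold=400):
    if normalized_micr_confidence >= threshold:
        # A MICR line was detected on what should be the back of the check.
        return 0      # fail
    return 1000       # pass

print(front_as_rear_test(850))  # -> 0    (a front image was passed as the rear)
print(front_as_rear_test(120))  # -> 1000 (no MICR line; likely the true rear)
```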


Computer-Implemented Embodiment


FIG. 7 is an exemplary embodiment of a mobile device 700 that can be used to implement the mobile IQA system according to an embodiment. Mobile device 700 includes a processor 710. The processor 710 can be a microprocessor or the like that is configurable to execute program instructions stored in the memory 720 and/or the data storage 740. The memory 720 is a computer-readable memory that can be used to store data and/or computer program instructions that can be executed by the processor 710. According to an embodiment, the memory 720 can comprise volatile memory, such as RAM, and/or persistent memory, such as flash memory. The data storage 740 is a computer-readable storage medium that can be used to store data and/or computer program instructions. The data storage 740 can be a hard drive, flash memory, an SD card, and/or other types of data storage.


The mobile device 700 also includes an image capture component 730, such as a digital camera. According to some embodiments, the mobile device 700 is a mobile phone, a smart phone, or a PDA, and the image capture component 730 is an integrated digital camera that can include various features, such as auto-focus and/or optical and/or digital zoom. In an embodiment, the image capture component 730 can capture image data and store the data in memory 720 and/or data storage 740 of the mobile device 700.


Wireless interface 750 of the mobile device can be used to send and/or receive data across a wireless network. For example, the wireless network can be a wireless LAN, a mobile phone carrier's network, and/or other types of wireless network.


I/O interface 760 can also be included in the mobile device to allow the mobile device to exchange data with peripherals such as a personal computer system. For example, the mobile device might include a USB interface that allows the mobile device to be connected to a USB port of a personal computer system in order to transfer information such as contact information to and from the mobile device and/or to transfer image data captured by the image capture component 730 to the personal computer system.


Those of skill in the art will appreciate that the various illustrative modules, components, engines, and method steps described in connection with the above described figures and the embodiments disclosed herein can often be implemented as electronic hardware, software, firmware or combinations of the foregoing. To clearly illustrate this interchangeability of hardware and software, various illustrative modules and method steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled persons can implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the invention. In addition, the grouping of functions within a module or step is for ease of description. Specific functions can be moved from one module or step to another without departing from the invention.


Moreover, the various illustrative modules, components, engines, and method steps described in connection with the embodiments disclosed herein can be implemented or performed with hardware such as a general purpose processor, a digital signal processor (“DSP”), an application specific integrated circuit (“ASIC”), a field programmable gate array (“FPGA”) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general-purpose processor is hardware and can be a microprocessor, but in the alternative, the processor can be any processor, controller, or microcontroller. A processor can also be implemented as a combination of computing devices, for example, a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.


Additionally, the steps of a method or algorithm and the functionality of a component, engine, or module described in connection with the embodiments disclosed herein can be embodied directly in hardware, in software executed by a processor, or in a combination of the two. Software can reside in computer- or controller-accessible computer-readable storage media including RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium including a network storage medium. An exemplary storage medium can be coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium can be integral to the processor. The processor and the storage medium can also reside in an ASIC.


Referring now to FIG. 8, computing module 1900 can be used to implement the systems and methods described herein and may represent, for example, computing or processing capabilities found within desktop, laptop, and notebook computers; mainframes, supercomputers, workstations, or servers; or any other type of special-purpose or general-purpose computing device as may be desirable or appropriate for a given application or environment. Computing module 1900 might also represent computing capabilities embedded within or otherwise available to a given device. For example, a computing module might be found in other electronic devices. Computing module 1900 might include, for example, one or more processors or processing devices, such as a processor 1904. Processor 1904 might be implemented using a general-purpose or special-purpose processing engine such as, for example, a microprocessor, controller, or other control logic.


Computing module 1900 might also include one or more memory modules, referred to as main memory 1908. For example, random access memory (RAM) or other dynamic memory might be used for storing information and instructions to be executed by processor 1904. Main memory 1908 might also be used for storing temporary variables or other intermediate information during execution of instructions by processor 1904. Computing module 1900 might likewise include a read only memory (“ROM”) or other static storage device coupled to bus 1902 for storing static information and instructions for processor 1904.


The computing module 1900 might also include one or more various forms of information storage mechanism 1910, which might include, for example, a media drive 1912 and a storage unit interface 1920. The media drive 1912 might include a drive or other mechanism to support fixed or removable storage media 1914, such as a hard disk drive, a floppy disk drive, a magnetic tape drive, an optical disk drive, a CD or DVD drive (R or RW), or other removable or fixed media drive. Accordingly, storage media 1914 might include, for example, a hard disk, a floppy disk, magnetic tape, a cartridge, an optical disk, a CD or DVD, or other fixed or removable medium that is read by, written to, or accessed by media drive 1912. As these examples illustrate, the storage media 1914 can include a computer usable storage medium having stored therein particular computer software or data.


In alternative embodiments, information storage mechanism 1910 might include other similar instrumentalities for allowing computer programs or other instructions or data to be loaded into computing module 1900. Such instrumentalities might include, for example, a fixed or removable storage unit 1922 and an interface 1920. Examples of such storage units 1922 and interfaces 1920 can include a program cartridge and cartridge interface, a removable memory (for example, a flash memory or other removable memory module) and memory slot, a PCMCIA slot and card, and other fixed or removable storage units 1922 and interfaces 1920 that allow software and data to be transferred from the storage unit 1922 to computing module 1900.


Computing module 1900 might also include a communications interface 1924. Communications interface 1924 might be used to allow software and data to be transferred between computing module 1900 and external devices. Examples of communications interface 1924 might include a modem or softmodem, a network interface (such as an Ethernet, network interface card, WiMedia, IEEE 802.XX or other interface), a communications port (such as, for example, a USB port, IR port, RS232 port, Bluetooth® interface, or other port), or other communications interface. Software and data transferred via communications interface 1924 might typically be carried on signals, which can be electronic, electromagnetic (which includes optical) or other signals capable of being exchanged by a given communications interface 1924. These signals might be provided to communications interface 1924 via a channel 1928. This channel 1928 might carry signals and might be implemented using a wired or wireless communication medium. These signals can deliver the software and data from memory or other storage medium in one computing system to memory or other storage medium in computing module 1900. Some examples of a channel might include a phone line, a cellular link, an RF link, an optical link, a network interface, a local or wide area network, and other wired or wireless communications channels.


In this document, the terms “computer program medium” and “computer usable medium” are used to generally refer to physical storage media such as, for example, main memory 1908, storage unit 1922, and storage media 1914. These and other various forms of computer program media or computer usable media may be involved in storing one or more sequences of one or more instructions to a processing device for execution. Such instructions embodied on the medium are generally referred to as “computer program code” or a “computer program product” (which may be grouped in the form of computer programs or other groupings). When executed, such instructions might enable the computing module 1900 to perform features or functions of the present invention as discussed herein.


While certain embodiments have been described above, it will be understood that the embodiments described are by way of example only. Accordingly, the systems and methods described herein should not be limited based on the described embodiments. Rather, the systems and methods described herein should only be limited in light of the claims that follow when taken in conjunction with the above description and accompanying drawings.

Claims
  • 1. A method comprising: by at least one hardware processor of a mobile device, receiving at least one image of a check, identifying at least one region of interest in the at least one image of the check, extracting the identified at least one region of interest from the at least one image of the check, and transmitting the extracted at least one region of interest to a server over at least one network without sending the at least one image of the check to the server; and by at least one hardware processor of the server, determining whether or not the at least one region of interest contains a specific type of content, when the at least one region of interest does contain the specific type of content, initiating a check deposit process, and, when the at least one region of interest does not contain the specific type of content, alerting a user without initiating the check deposit process.
  • 2. The method of claim 1, wherein alerting the user comprises: by the at least one hardware processor of the server, transmitting a notification to the mobile device over the at least one network; and, by the at least one hardware processor of the mobile device, alerting the user via a graphical user interface displayed on a display of the mobile device.
RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 16/742,439, filed on Jan. 14, 2020, which is a continuation of U.S. patent application Ser. No. 15/838,088, filed on Dec. 11, 2017, now U.S. Pat. No. 10,558,972, which is a continuation of U.S. patent application Ser. No. 13/488,349, filed on Jun. 4, 2012, now U.S. Pat. No. 9,842,331, which is a continuation-in-part of U.S. patent application Ser. No. 12/906,036, filed on Oct. 15, 2010, now U.S. Pat. No. 8,577,118, which is a continuation-in-part of U.S. patent application Ser. No. 12/778,943, filed on May 12, 2010, now U.S. Pat. No. 8,582,862. U.S. patent application Ser. No. 13/488,349 is also a continuation-in-part of U.S. patent application Ser. No. 12/346,026, filed on Dec. 30, 2008, now U.S. Pat. No. 7,978,900, which claims the benefit of U.S. Provisional Patent Application No. 61/022,279, filed on Jan. 18, 2008. U.S. patent application Ser. No. 13/488,349 is also a continuation-in-part of U.S. patent application Ser. No. 12/778,943, filed on May 12, 2010, now U.S. Pat. No. 8,582,862, all of which are hereby incorporated by reference in their entirety.

Related Publications (1)
Number Date Country
20210073786 A1 Mar 2021 US
Provisional Applications (1)
Number Date Country
61022279 Jan 2008 US
Continuations (3)
Number Date Country
Parent 16742439 Jan 2020 US
Child 16952886 US
Parent 15838088 Dec 2017 US
Child 16742439 US
Parent 13488349 Jun 2012 US
Child 15838088 US
Continuation in Parts (4)
Number Date Country
Parent 12906036 Oct 2010 US
Child 13488349 US
Parent 12778943 May 2010 US
Child 13488349 US
Parent 12778943 May 2010 US
Child 12906036 US
Parent 12346026 Dec 2008 US
Child 13488349 US