Systems and methods for processing an image of a check during mobile deposit

Information

  • Patent Grant
  • Patent Number
    11,321,678
  • Date Filed
    Thursday, March 26, 2020
  • Date Issued
    Tuesday, May 3, 2022
Abstract
An image of a check that is in the field of view of a camera is monitored prior to the image of the check being captured. The camera is associated with a mobile device. When the image of the check in the field of view passes monitoring criteria, an image may be taken by the camera and provided from the mobile device to a financial institution. The image capture may be performed automatically as soon as the image of the check is determined to pass the monitoring criteria. The check may be deposited in a user's bank account based on the image. Any technique for sending the image to the financial institution may be used. Feedback may be provided to the user of the camera regarding the image of the check in the field of view.
Description
BACKGROUND

Checks typically provide a safe and convenient method for an individual such as a payor to transfer funds to a payee. To use a check, the individual usually opens a checking account, or other similar account, at a financial institution and deposits funds, which are then available for later withdrawal. To transfer funds with a check, the payor usually designates a payee and an amount payable on the check. In addition, the payor often signs the check. Once the check has been signed, it is usually deemed negotiable, meaning the check may be validly transferred to the payee upon delivery. By signing and transferring the check to the payee, the payor authorizes funds to be withdrawn from the payor's account on behalf of the payee.


While a check may provide a payor with a convenient and secure form of payment, receiving a check may put certain burdens on the payee, such as the time and effort required to deposit the check. For example, depositing a check typically involves going to a local bank branch and physically presenting the check to a bank teller. To reduce such burdens for the payee, systems and methods have been developed to enable the remote deposit of checks. For example, the payee may capture a digital image of a check using a mobile device. The financial institution may then receive from the payee the digital image of the check. The financial institution may then use the digital image to credit funds to the payee. However, such a technique requires the efficient and accurate detection and extraction of the information pertaining to a check in the digital image. Capturing a digital image at a mobile device that allows for subsequent detection and extraction of the information from the digital image is difficult.


SUMMARY

An image of a check that is in the field of view of a camera is monitored prior to the image of the check being captured. The camera is associated with a mobile device. The monitoring may be performed by the camera, the mobile device, and/or a financial institution that is in communication with the mobile device. When the image of the check in the field of view passes monitoring criteria, an image may be taken by the camera and provided from the mobile device to a financial institution. The check may be deposited in a user's bank account based on the image. Any technique for sending the image to the financial institution may be used.


In an implementation, the image capture may be performed automatically by the camera, the mobile device, and/or the financial institution as soon as the image of the check is determined to pass the monitoring criteria. In an implementation, feedback may be provided to the user of the camera regarding the image of the check in the field of view. The user may reposition the check and/or the camera, for example, responsive to the feedback. Alternatively, the user may capture an image of the check responsive to the feedback.


This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.





BRIEF DESCRIPTION OF THE DRAWINGS

The foregoing summary, as well as the following detailed description of illustrative embodiments, is better understood when read in conjunction with the appended drawings. For the purpose of illustrating the embodiments, there are shown in the drawings example constructions of the embodiments; however, the embodiments are not limited to the specific methods and instrumentalities disclosed. In the drawings:



FIG. 1 is a block diagram of an implementation of a system in which example embodiments and aspects may be implemented;



FIG. 2 shows a high-level block diagram of an implementation of a system that may be used for the deposit of a check;



FIG. 3 is a diagram of an example image comprising a check image, a background image, and feedback;



FIG. 4 is a diagram of an example image divided into segments that may be used for monitoring the image;



FIG. 5 is a diagram of an example histogram for a segment of an image comprising check data and background data;



FIG. 6 shows a data flow diagram of a system for the deposit of a check, in accordance with an example embodiment;



FIG. 7 shows a block diagram of a client apparatus and a server apparatus for the deposit of a check, in accordance with an example embodiment;



FIG. 8 is an operational flow of an implementation of a method that may be used for deposit of a check using image monitoring of the check;



FIG. 9 is an operational flow of another implementation of a method that may be used for deposit of a check using image monitoring of the check; and



FIG. 10 is a block diagram of an example computing environment in which example embodiments and aspects may be implemented.





DETAILED DESCRIPTION

In the following detailed description of example embodiments, reference is made to the accompanying drawings, which form a part hereof and in which is shown, by way of illustration, specific embodiments in which the example methods, apparatuses, and systems may be practiced. It is to be understood that other embodiments may be used and structural changes may be made without departing from the scope of this description.



FIG. 1 is a block diagram of an implementation of a system 100 in which example embodiments and aspects may be implemented. System 100 may include an account owner, referred to herein as a user 102, and financial institutions 130, 140, and 150, which may be any type of entity capable of processing a transaction involving a negotiable instrument. For example, financial institutions 130, 140, and 150 may be a retail bank, an investment bank, an investment company, a regional branch of the Federal Reserve, a clearinghouse bank, and/or a correspondent bank.


A negotiable instrument typically includes a type of contract that obligates one party to pay a specified sum of money to another party. A negotiable instrument, as used herein, is an unconditioned writing that promises or orders payment of a fixed amount of money. One example of a negotiable instrument is a check. The check may be taken by the receiving party and deposited into an account at a financial institution of the receiving party. The receiving party may endorse the check and then present it for deposit at a bank branch, via an automated teller machine (ATM), or by using remote deposit. Other examples of negotiable instruments include money orders, cashier's checks, drafts, bills of exchange, promissory notes, and the like. A money order is a trusted financial instrument that is a payment order for a pre-specified amount of money. A cashier's check (also known as a bank check, official check, teller's check, bank draft or treasurer's check) is a check guaranteed by a bank and may be purchased from a bank.


The user 102 may be an individual or entity who owns account 160 that may be held at financial institution 130. Account 160 may be any type of deposit account for depositing funds, such as a savings account, a checking account, a brokerage account, and the like. The user 102 may deposit a check 108 or other negotiable instrument in the account 160 either electronically or physically. The financial institution 130 may process and/or clear the check 108 or other negotiable instrument. The user 102 may communicate with financial institution 130 by way of communications network 120 such as an intranet, the Internet, a local area network (LAN), a wide area network (WAN), a wireless fidelity (WiFi) network, a public switched telephone network (PSTN), a cellular network, a voice over Internet protocol (VoIP) network, and the like. The user 102 may communicate with financial institution 130 by phone, email, instant messaging, text messaging, web chat, facsimile, mail, and the like. Financial institutions 130, 140, and 150 also may communicate with each other by way of communications network 120.


In an implementation, the user 102 may receive payment from another individual such as a payor in the form of a check 108 or other negotiable instrument that is drawn from account 170 at financial institution 150. The user 102 may endorse the check 108 (e.g., sign the back of the check 108) and indicate an account number on the check 108 for depositing the funds. It is noted that although examples described herein may refer to a check, the techniques and systems described herein are contemplated for, and may be used for, deposit of any negotiable instrument. Similarly, the techniques and systems described herein are contemplated for and may be used with any form or document whose image may be captured with a camera or other imaging device of a mobile device for subsequent storage and/or processing.


As described further herein, a digital image of a check or other negotiable instrument may be provided from a user to a financial institution, and the digital image may be processed and funds associated with the check or negotiable instrument in the digital image may be deposited in a user's bank account. The user 102 may deposit the check 108 into account 160 by making a digital image of the check 108 and sending the image file containing the digital image to financial institution 130. For example, after endorsing the check 108, the user 102 may use a mobile device 106 that comprises a camera to convert the check 108 into a digital image by taking a picture of the front and/or back of the check 108. The mobile device 106 may be a mobile phone (also known as a wireless phone or a cellular phone), a personal digital assistant (PDA), or any handheld computing device, for example. Aspects of an example mobile device are described with respect to FIG. 10.


To increase the likelihood of capturing a digital image of the check 108 that may be readable and processed such that the check 108 can be cleared, the image is monitored for compliance with one or more monitoring criteria, prior to the image of the check 108 being captured. The monitoring criteria may be directed to proper lighting and/or framing of the check 108 in an image of the check 108 that will be captured and presented for clearing of the check 108. An application may monitor whether the check 108 is sufficiently within the frame of the camera and has a high enough quality for subsequent processing. The monitoring is performed with respect to the image as it appears in the field of view of the camera of the mobile device 106. The field of view is that part of the world that is visible through the camera at a particular position and orientation in space; objects outside the field of view when the image is captured are not recorded in the image. The monitoring criteria may be based on one or more of light contrast on the image, light brightness of the image, positioning of the image, dimensions, tolerances, character spacing, skewing, warping, corner detection, and MICR (magnetic ink character recognition) line detection, as described further herein. In an implementation, one or more histograms may be determined using the image being monitored. The histograms may be used in conjunction with monitoring criteria, as described further herein.


The monitoring may be performed by the camera, the mobile device 106, and/or a financial institution that is in communication with the mobile device 106. When the image of the check 108 in the field of view passes the monitoring criteria, an image may be taken by the camera and provided from the mobile device 106 to a financial institution. By ensuring that the image of the check passes monitoring criteria during pre-image capture monitoring, the number of non-conforming images of checks is reduced during presentment of the images to a financial institution for processing and clearing. In an implementation, feedback may be provided to the user 102 regarding the image of the check in the field of view. Based on the feedback, the user 102 may reposition the check 108 and/or the camera, for example, or may capture an image of the check 108.
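The pre-capture monitoring described above can be sketched in code. The following is an illustrative example only, not the disclosed implementation: the frame format (a 2D list of grayscale values), the contrast criterion, and the threshold are assumptions chosen for clarity.

```python
# Hypothetical sketch of the pre-capture monitoring loop. Each criterion
# inspects the current camera frame and returns (passed, feedback_message).

def check_contrast(frame):
    # Assumed criterion: require a minimum grayscale contrast in the frame.
    flat = [px for row in frame for px in row]
    contrast = max(flat) - min(flat)
    return (contrast >= 100, "" if contrast >= 100 else "adjust lighting")

def monitor_frame(frame, criteria):
    """Run every monitoring criterion; signal capture only if all pass."""
    feedback = []
    for criterion in criteria:
        passed, message = criterion(frame)
        if not passed:
            feedback.append(message)
    if not feedback:
        return ("capture", ["take the picture now"])
    return ("wait", feedback)

# A dark check region against a light background passes the contrast criterion.
frame = [[20, 20, 230], [20, 230, 230]]
action, messages = monitor_frame(frame, [check_contrast])
```

Because each criterion returns a pass/fail result plus a feedback message, further criteria (skew, corner detection, MICR line detection, and the like) could be appended to the list without changing the loop itself.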


In an implementation, the image capture may be performed automatically by the camera, the mobile device 106, and/or the financial institution as soon as the image of the check 108 is determined to pass the monitoring criteria. Alternatively, the user 102 may manually instruct the camera to perform the image capture (e.g., by pressing a button on the camera or the mobile device 106) after the user 102 receives an indication or other feedback that the image passes the monitoring criteria.


In an implementation, the user 102 may send the digital image(s) to financial institution 130 using the mobile device 106. Any technique for sending a digital image to financial institution 130 may be used, such as providing a digital image to a website associated with financial institution 130 from storage, emailing a digital image to financial institution 130, or sending a digital image in a text message or instant message, for example.


Financial institution 130 may receive a digital image representing the check 108 and may use any known image processing software or other application(s) to obtain the relevant data of the check 108 from the digital image. Financial institution 130 may determine whether the financial information associated therewith may be valid. For example, financial institution 130 may include any combination of systems and subsystems such as electronic devices including, but not limited to, computers, servers, databases, or the like. The electronic devices may include any combination of hardware components such as processors, databases, storage drives, registers, cache, random access memory (RAM) chips, data buses, or the like and/or software components such as operating systems, database management applications, or the like. According to an embodiment, the electronic devices may include a network-based server that may process the financial information and may receive the digital image from the user 102.


The electronic devices may receive the digital image and may perform an analysis on the quality of the digital image, the readability of the data contained therein, or the like. For example, the electronic devices may determine whether the account number, amount payable, and the like may be readable such that it may be parsed or otherwise obtained and processed by the financial institution to credit an account 160 associated with the user 102 and debit an account associated with the payor. In an implementation, a representative 135 of financial institution 130 may provide assistance to the user 102, such as in determining whether the financial information may be readable and/or of a good enough quality to be processed.


Upon receipt and approval of the digital image, financial institution 130 may credit the funds to account 160. Financial institution 130 may clear the check 108 by presenting a digital image of the check 108 captured from the digital image to an intermediary bank, such as a regional branch of the Federal Reserve, a correspondent bank, and/or a clearinghouse bank. For example, the check 108 may be cleared by presenting the digital image to financial institution 140, which may be a regional branch of the Federal Reserve, along with a request for payment. Financial institutions 130 and 150 may have accounts at the regional branch of the Federal Reserve. Financial institution 130 may create a substitute check using the image provided by the user 102 and present the substitute check to financial institution 140 for further processing. Upon receiving the substitute check, financial institution 140 may identify financial institution 150 as the paying bank (e.g., the bank from which the check 108 is drawn). This may be accomplished using a nine-digit routing number located on the bottom left-hand corner of the check. A unique routing number is typically assigned to every financial institution in the United States. Financial institution 140 may present the substitute check to financial institution 150 and request that the check be paid. If financial institution 150 verifies the check (i.e., agrees to honor the check), financial institution 140 may then settle the check by debiting funds from financial institution 150 and crediting funds to financial institution 130. Financial institution 150 may then debit funds from account 170.


It will be appreciated that the preceding examples are for purposes of illustration and explanation only, and that an embodiment is not limited to such examples. For example, financial institution 150 may be a correspondent bank (i.e., engaged in a partnership with financial institution 130). Thus, financial institution 130 may bypass the regional branch of the Federal Reserve and clear the check directly with financial institution 150. In addition, account 160 and account 170 may both be held at financial institution 130, in which case the check 108 may be cleared internally.


In an implementation, the mobile device 106 may comprise a video source such as a video camera, a web camera, or a video-enabled phone, for example, to obtain a video of the check 108. A frame of the video may be obtained and monitored with respect to monitoring criteria, as described further herein. The mobile device 106 and/or the institution may obtain the frame and monitor the frame, depending on an implementation. Generation of a live video of a check 108 is not limited to a video camera, a web camera, and a video-enabled phone, and it is contemplated that any device that is capable of generating a live video may be used to make a video of the check 108 which may be monitored in real-time with respect to monitoring criteria. Additional devices that may be used in the generation and/or transmission of a live video include a web-enabled video computing device, a mobile phone, a camcorder, and a computer camera, for example.



FIG. 2 shows a high-level block diagram of an implementation of a system 200 that may be used for the deposit of a check, such as the check 108. As described further herein, the user 102 may deposit the funds of the check 108 using the camera functionality in the mobile device 106. In the example of one person giving a check to another person, this would enable the receiving party to deposit the funds at that time, without physically visiting an ATM or a bank branch.


In an implementation, the mobile device 106 may comprise a camera 207, such as a digital camera. Such a mobile device may be called a camera phone. The mobile device 106, through the camera 207, has the ability to take or capture a picture or digital image of the check 108 or other negotiable instrument. The camera 207 may take an image of the front of the check 108. Alternatively, the camera 207 may take an image of both the front and the back of the check 108. The back of the check may provide endorsement verification, such as the signature of the person or party the check is made out to.


In an implementation, prior to an image in the field of view of the camera 207 being captured by the camera 207, the image may be monitored with respect to monitoring criteria, e.g., using a software application running on the mobile device 106. Feedback based on the monitoring of the image may be provided to the user 102 to assist the user 102 in positioning the check 108 so that the image of the check 108 may be captured in such a manner that it may be more easily processed and cleared during subsequent operations, such as those involving one or more financial institutions.


A depository 204 may include a bank in which the user 102 has a deposit account; however, the present disclosure is not limited to just banks. Alternatively, a third party may act as the depository 204 providing functionality to a plurality of users without regard to the bank at which they have deposit accounts, or whether their individual bank allows for the methods and systems described herein. The depository 204, in an implementation, after receiving the image(s) of the check 108 from the user 102, may use a clearinghouse 210 to perform the check clearing operations. As described with respect to the system 100 of FIG. 1, check clearing operations are used by banks to do the final settlement of the check 108, such as removing funds from the account of the payor and transferring those funds to the user's bank. The user's bank may choose to make the funds available to the user 102 immediately and take on the risk that the check 108 does not clear. However, for various reasons, the bank may only make those funds available to the user 102 after the check 108 finally clears.


In an implementation, the user 102 may place the check 108 on a background and generate a digital image comprising an image of the check (e.g., a check image) and a portion of the background (e.g., a background image) using the camera 207. Any background may be used, although a dark background or a consistently colored background may provide better results. It is noted that although examples and implementations described herein may refer to a check image and check data, the term “check image” may refer to any foreground image in a digital image (as opposed to the background image) and the term “check data” may refer to any foreground data in a digital image (as opposed to background data). Thus, the “check image” and the “check data” may refer to the foreground image and foreground data in implementations involving any negotiable instrument, form, or document.


In an implementation, the image being monitored in the field of view of the camera 207 comprises check data and background data. The check data pertains to the check image and the background data pertains to the background image (e.g., the background on which the check image is disposed).



FIG. 3 is a diagram of an example image 230 comprising a check image 247, a background image 250, and a feedback indicator 235 providing feedback to the user 102. The image 230 may be generated by an imaging device associated with the mobile device 106, such as the camera 207. An edge 245 separates the check image 247 from the background image 250. The edge 245 may be detected using any known technique(s). The image 230 may be provided in the field of view of the camera 207 prior to and during image capture of the check 108. The user 102 may adjust the camera 207, the check 108, and/or any light source so that the image 230 passes one or more monitoring criteria. For example, a light source from a specific angle can lead to poor light contrast. Light contrast may be a monitoring criterion, and poor light contrast may be corrected easily by moving the lens of the camera 207 to a different perspective, thereby allowing the image to pass the monitoring criterion.


Feedback regarding the image 230 in the field of view with respect to the monitoring criteria may be generated and provided to the user 102. In an implementation, the feedback may be provided visually, such as by text (e.g., “go closer”, “go farther”, “move the check to the right”, “put the check on a darker background”, “tilt the camera down”, “take the picture now”, etc.), arrows, or other visual indicators or cues (e.g., green lights, red lights, etc.) overlaid on the image 230 shown as feedback indicator 235. Alternatively or additionally, feedback may be provided to the user 102 aurally, such as through a speaker associated with the mobile device 106. The feedback may advise the user 102 to move the camera 207 or the check 108 or adjust the lighting or the background, for example. The feedback may also advise the user 102 when the image 230 passes the one or more monitoring criteria and to capture the image of the check 108.


One of the monitoring criteria may be based on the positioning of the check 108 in the image 230. The positioning of the check 108 may be determined from the image 230 and compared with predetermined dimensions (e.g., of a typical personal check, of a typical business check) and tolerances. If the dimensions are within a certain acceptable tolerance, then it may be determined that the check 108 is properly positioned. Such feedback may be generated and provided to the user 102.
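The dimension-and-tolerance comparison may be sketched as follows. The check dimensions below are typical U.S. check sizes, and the tolerance value is an assumption for illustration; neither is taken from the disclosure.

```python
# Illustrative positioning criterion: compare the detected check's aspect
# ratio against known check formats, within a relative tolerance.

PERSONAL_CHECK = (6.0, 2.75)   # width, height in inches (typical personal check)
BUSINESS_CHECK = (8.5, 3.5)    # typical business check

def is_properly_positioned(width_px, height_px, tolerance=0.15):
    """Return True if the detected width/height ratio matches a known
    check format within the given relative tolerance."""
    if height_px == 0:
        return False
    ratio = width_px / height_px
    for w, h in (PERSONAL_CHECK, BUSINESS_CHECK):
        expected = w / h
        if abs(ratio - expected) / expected <= tolerance:
            return True
    return False
```

For example, a detected region of 600 by 275 pixels matches the personal-check aspect ratio and would pass this criterion, while a square region would fail and trigger repositioning feedback.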


In an implementation, the positioning of the check 108 in the image 230 may be compared with an alignment guide (which may or may not be visible to the user 102 in the field of view of the camera 207). For example, measurements may be made by a processor in the camera 207, the mobile device 106, or a computing device at the financial institution to determine the check's position with respect to the alignment guide. The measurements may be compared to predetermined measurements or values to determine whether the check's positioning in the image 230 is proper or sufficient for further processing of the image. Edge detection and/or corner detection may be used in such measurements (e.g., in measuring the distance from the check 108 in the image 230 to the alignment guide). Any known technique(s) for edge detection and/or corner detection may be used. In an implementation, corner detection itself may be a monitoring criterion, such that if corner detection of the check 108 in the image 230 is achieved, then it may be concluded that the image 230 may be properly processed and cleared by a depository (i.e., the image 230 passes the monitoring criteria).


The alignment guide may be overlaid on the camera feed of the mobile device 106, in an implementation. The alignment guide may take any shape such as a bounding rectangle or other bounding box or shape, horizontal and/or vertical bars, parallel lines, etc., for example. With a bounding rectangle, for example, used as the alignment guide, aligning the check 108, thereby passing this monitoring criterion, means enclosing the check 108 within the bounding rectangle. If the check 108 is outside of the alignment guide in the image 230, feedback may be generated and provided to the user 102 regarding this monitoring criterion with instructions for moving the check 108 or the camera 207 in order to properly align the check 108 in the field of view.
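A minimal sketch of the bounding-rectangle alignment criterion follows: the check passes only when its detected corners all fall inside the guide. The coordinates and guide geometry are illustrative assumptions.

```python
# Illustrative alignment criterion using a bounding rectangle as the guide.

def inside_guide(check_corners, guide):
    """check_corners: list of (x, y) corner points; guide: (left, top, right, bottom)."""
    left, top, right, bottom = guide
    return all(left <= x <= right and top <= y <= bottom
               for x, y in check_corners)

guide = (50, 100, 590, 380)                       # on-screen bounding rectangle
aligned = [(60, 120), (580, 120), (60, 360), (580, 360)]
misaligned = [(30, 120), (580, 120), (30, 360), (580, 360)]  # spills past the left edge
```

When `inside_guide` returns False, feedback such as "move the check to the right" could be generated from whichever corner lies outside the guide.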


The operator of the camera 207 may introduce distortions in the image due to a perspective problem, such as angling the camera vertically over the check so that the top of the check appears smaller than the bottom, or the reverse. Monitoring criteria may also be directed to determining whether the image is skewed or warped. Skewing occurs when the check 108 is rotated from the horizontal in the image 230. By measuring the distance from the edge(s) of the check 108 in the image to an alignment guide or the edge of the field of view, it may be determined whether the check 108 is skewed (e.g., by comparing the distances to one another, by comparing the distances to predetermined values, etc.). If skewing is present in the image 230, feedback may be generated and provided to the user 102 with instructions for moving the check 108 or the camera 207 in order to properly align the check 108 in the field of view with respect to the horizontal.
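The skew test described above, comparing distances from the check's top edge to the edge of the field of view, may be sketched as follows. The threshold value and coordinate convention are assumptions for illustration.

```python
# Illustrative skew criterion: if the y-coordinates of the check's two top
# corners differ by more than a threshold, the check is rotated from the
# horizontal and repositioning feedback is generated.

def detect_skew(top_left_y, top_right_y, threshold_px=10):
    """Return None if the top edge is level within threshold_px,
    otherwise a feedback message for the user."""
    if abs(top_left_y - top_right_y) <= threshold_px:
        return None  # passes the skew criterion
    return "rotate the check or camera to level the check's top edge"
```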


Warping, as used herein, is meant to denote that the check 108 is tilted forward or back with respect to a plane that is perpendicular to a line drawn from the camera lens to the center of the check 108. Warping, or tilting, of the image may lead to incorrect optical detection of the check 108. In an implementation, a processor in the camera 207, the mobile device 106, or a computing device at the financial institution may determine whether warping is present in the image, and if so, may generate and provide feedback to the user 102. Such feedback may comprise instructions to the user 102 for moving the check 108 or the camera 207 such that the check 108 would appear to be perpendicular to an imaginary line drawn from the center of the camera lens to the center of the check 108 itself (e.g., dewarping instructions).


If user involvement is tolerated, the user may be queried to supply or identify one or more corners of the check 108 in the image 230. The perimeter of the check 108 may be determined using this information. Additionally, this information may be used for monitoring the image 230 for distortions.


In an implementation, a monitoring criterion may be whether the MICR line can be detected and/or read. Any known MICR line detection technique(s) may be used by the camera 207, the mobile device 106, and/or the financial institution (e.g., using an image processor, for example) to detect the MICR line on the check 108 in the image 230. If the MICR line can be detected, it may be determined that the image 230 may be captured and sent to the financial institution for processing and clearing of the check 108 (i.e., the image passes the monitoring criterion directed to MICR line detection). If the MICR line cannot be detected, feedback may be provided to the user 102, such as to reposition the check 108 and/or the camera 207 (i.e., the image fails to pass the monitoring criterion, perhaps because the image is out of focus or the lighting is inadequate, for example).


In an implementation, spacing between certain characters, points, or features (e.g., MICR number, “$” sign, signature line, courtesy amount line, legal amount line, etc.) may be determined and used as a monitoring criterion. For example, if the MICR line can be detected, then the spacing between the numbers in the MICR line may be determined using any known measuring and/or image processing technique(s). If the spacing is outside of a certain range corresponding to valid spacing between numbers in a MICR line, then it may be determined that the image 230 may not be properly processed if captured by the camera 207. In such a case, feedback may be generated and provided to the user 102, such as to reposition the check 108 and/or the camera 207.
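The character-spacing criterion may be sketched as follows. The E-13B MICR font nominally prints eight characters per inch, but the acceptable pixel range below is an assumption that depends entirely on image resolution, as are the example coordinates.

```python
# Illustrative MICR spacing criterion: given the detected x-positions of
# MICR characters, verify that every adjacent gap falls in an expected range.

def micr_spacing_ok(char_positions, min_gap=10, max_gap=20):
    """char_positions: sorted x-coordinates of detected MICR characters."""
    gaps = [b - a for a, b in zip(char_positions, char_positions[1:])]
    return bool(gaps) and all(min_gap <= g <= max_gap for g in gaps)
```

A run of evenly pitched characters (e.g., positions 0, 15, 30, 45) passes; irregular gaps suggest a misread or distorted MICR line, prompting repositioning feedback.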


Another monitoring criterion may be based on the light in the image 230, such as the light contrast and/or light brightness found on the image 230, such as in various regions of the image 230. For example, if the light contrast between the check image 247 and the background image 250 is less than a predetermined amount, then it may be determined that the image 230 may not be properly processed if captured by the camera 207. In such a case, instead of capturing the image 230, feedback may be generated and provided to the user 102 to adjust the camera 207, the check 108, and/or the lighting in order to bring the image 230 into compliance with the monitoring criteria.


As another example, the light brightness on various regions of the image may be determined and compared to each other and/or may be compared to a predetermined threshold. If the difference between the light brightness of the various regions is less than a predetermined amount (e.g., the light brightness does not vary significantly among the regions) or if the light brightness is less than a predetermined threshold, then it may be determined that the image 230 may be properly processed if captured by the camera 207. Otherwise, feedback may be generated and provided to the user 102 to adjust the camera 207, the check 108, and/or the lighting in order to change the light brightness on the image 230.
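The region-brightness comparison may be sketched as follows; the spread and ceiling thresholds are illustrative assumptions, not values from the disclosure.

```python
# Illustrative brightness criterion: compute the mean brightness of each
# region and require that the regions not differ by more than max_spread
# and that no region exceeds a brightness ceiling (overexposure).

def brightness_uniform(regions, max_spread=40, max_mean=240):
    """regions: list of lists of grayscale pixel values (0-255)."""
    means = [sum(r) / len(r) for r in regions]
    return (max(means) - min(means) <= max_spread) and max(means) <= max_mean
```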


In an implementation, one or more histograms may be generated based on the image 230 and used in the determination of light contrast and/or light brightness monitoring criteria. A histogram is a well-known graph that may be used to display where all of the brightness levels contained in an image are found, from the darkest to the brightest. These values may be provided across the bottom of the graph from left (darkest) to right (brightest). The vertical axis (the height of points on the graph) shows how much of the image is found at any particular brightness level.


Histograms may be used to monitor whether the light on the image 230 is uniform, not too bright, etc. For example, the mobile device 106 can monitor the histogram of the image 230 to ensure that there is a large contrast between the background image 250 and the check image 247. Feedback may be provided to the user 102 as to how to move or adjust the camera, lighting, etc. in order to get a good image for subsequent processing (i.e., how to get an image that passes the monitoring criteria).
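A sketch of the histogram approach (pure Python; the bimodality test and its 10%/90% bounds are illustrative assumptions, not from the source): a 256-bin histogram is built from the grayscale pixel values, and large contrast between check and background shows up as significant mass both below and above a mid-level split.

```python
def grayscale_histogram(pixels):
    """Build a 256-bin histogram from an iterable of grayscale values (0-255)."""
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    return hist

def has_two_peaks(hist, split=128):
    """Crude bimodality test: the image passes when a meaningful fraction of
    pixels lies on each side of the split level, suggesting distinct check
    and background densities. Bounds are illustrative."""
    total = sum(hist)
    dark = sum(hist[:split])
    return 0.1 * total < dark < 0.9 * total
```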


In an implementation, the image 230 may be divided into segments, such as those shown in FIG. 4. FIG. 4 is a diagram of the example image 230 of FIG. 3 divided into segments 260, 265, 270, 275 that may be used for monitoring the image 230. Although four segments are shown in FIG. 4, any number of segments may be used with techniques described herein. Although the segments 260, 265, 270, 275 are formed by dividing the image 230 into quadrants, the segments may be formed by any techniques, take any shape, and have any area, subject to a constraint that each segment comprises a portion of the check data 247 and a portion of the background data 250 separated by a portion of the edge 245. In this manner, distinct areas of density corresponding to the background of the image and the check data of the image may be provided in a histogram.
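The quadrant division can be sketched as follows (pure Python, representing an image as a list of equal-length rows of grayscale values; the helper name is hypothetical). When the check is roughly centered, each quadrant contains part of the check, part of the background, and part of the edge between them, as the constraint above requires.

```python
def quadrants(image):
    """Split a 2-D image (list of equal-length rows) into four quadrants,
    returned in the order top-left, top-right, bottom-left, bottom-right."""
    h, w = len(image), len(image[0])
    mh, mw = h // 2, w // 2
    return [
        [row[:mw] for row in image[:mh]],  # top-left
        [row[mw:] for row in image[:mh]],  # top-right
        [row[:mw] for row in image[mh:]],  # bottom-left
        [row[mw:] for row in image[mh:]],  # bottom-right
    ]
```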



FIG. 5 is a diagram of an example histogram 280 for a segment of an image comprising check data and background data. The horizontal axis of the histogram 280 represents the grayscale level between 0 and 255, where 0 represents true black and 255 represents true white. The vertical axis represents the amount of the image at a particular grayscale level of the horizontal axis. Any known technique for generating a histogram for an image (such as a grayscale image of the image 230) may be used. The histogram 280 shows two distinct areas of density (i.e., two distinct density distributions). The density area 286 closer to the grayscale level of zero corresponds to the background of the image, and the density area 289 closer to the grayscale level of 255 corresponds to the check data.


The density distribution for each segment (or for the entire image 230) may be analyzed to determine whether the light contrast and/or light brightness is appropriate for processing and clearing of the check 108 in the image 230 (and thus passes that monitoring criterion) or whether the light contrast and/or light brightness does not pass the monitoring criterion and the camera 207, the check 108, and/or the light source should be adjusted or repositioned. For example, the density distributions for the segments may be compared with each other and/or may be compared to predetermined values or levels. If the differences are less than a predetermined difference amount, such as less than 1 percent different, less than 5 percent different, etc., then the image 230 may be captured and sent to the financial institution for processing. Otherwise, feedback may be generated and provided to the user 102 to reposition the camera 207, the check 108, and/or the light source.
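The per-segment comparison can be sketched as follows (helper names are hypothetical; the source's "less than 1 percent" / "less than 5 percent" figures motivate the default tolerance). Each segment's histogram is summarized by its mean grayscale level, and the criterion passes when the relative spread across segments is within the tolerance.

```python
def segment_contrast_ok(segment_histograms, max_rel_diff=0.05):
    """Compare the mean grayscale level of each segment's 256-bin histogram;
    pass when the relative spread across segments is within max_rel_diff
    (e.g., 0.05 for 5 percent). Summary statistic is an illustrative choice."""
    def mean_level(hist):
        total = sum(hist)
        return sum(level * n for level, n in enumerate(hist)) / total
    means = [mean_level(h) for h in segment_histograms]
    return (max(means) - min(means)) / max(means) <= max_rel_diff
```

On failure, the monitoring logic would generate feedback to reposition the camera 207, the check 108, and/or the light source rather than capture the image.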


When the image 230 passes the monitoring criteria (e.g., is positioned properly with respect to an alignment guide, is not warped, is not skewed, has adequate light brightness and/or light contrast, etc.), the image 230 may be captured either automatically (e.g., by the camera or the mobile device under direction of an application running on the camera 207 or the mobile device 106 or the financial institution) or manually (e.g., by the user 102 pressing a button or making a selection on the camera 207 or the mobile device 106). The digital image thus captured may be provided from the mobile device 106 to a financial institution. The check 108 may be deposited in a user's bank account based on the digital image. Any technique for sending the digital image to the financial institution may be used.



FIG. 6 shows a data flow diagram 300 of a system for the deposit of a check, in accordance with an example embodiment. In the data flow diagram 300, a client 320 is one example of the mobile device 106 of the user 102 described with respect to the systems 100 and 200 of FIGS. 1 and 2, respectively. In an implementation, a server 322 may be a software component operable by the depository 204 of FIG. 2. The client 320 may log in to a remote deposit system executed on the server 322. The login 325 may serve to authenticate the user 102 as an authorized consumer of the depository 204.


The server 322, in one example, may send instructions 330 to the client 320 that execute an application on the client 320. This may include instructions that cause a software object, which may have been previously downloaded and installed (e.g., pre-installed) on the client 320, to be executed on the client 320. The software object may analyze the image in the field of view of a digital camera (e.g., the image 230 shown in the field of view of the camera 207 associated with the mobile device 106) with respect to one or more monitoring criteria and may generate and provide feedback to the user regarding the monitoring criteria and/or instructions for capturing an image of the check 108.


In another example, the instructions 330 may include a wholly self-contained application that when delivered to the client 320 will execute and perform one or more operations described herein, such as those directed to analyzing the image in the field of view of the camera 207 with respect to monitoring criteria and providing feedback to the user 102. In either example, the software object may be configured to make one or more software calls 310 to the camera 207. This may be through specific software instructions to the camera 207. In other words, the camera's functionality may not be abstracted through any software library. In such an example, software code may be written and delivered to every different camera-equipped mobile phone.


In an alternate example, the software object may operate through a software abstraction layer, such as an application programming interface (API). The software object developer may only insert code into the software object to call one or more APIs exposed by the software operating the mobile device 106. One example of such software is Windows Mobile by Microsoft Corporation. In the context of a Windows Mobile device, the Windows Mobile operating system (OS) has one or more APIs exposed to application developers that will translate instructions from applications into instructions operable by the camera 207 on the mobile device 106. A mobile operating system, also known as a mobile platform or a handheld operating system, is the operating system that controls a mobile device. Other mobile OSs include Symbian OS, iPhone OS, Palm OS, BlackBerry OS, and Android.


The software object may cause the camera 207 to analyze an image in the field of view with respect to monitoring criteria, provide feedback, and/or take a picture or capture one or more images of the check 108 being deposited. These images may be captured sequentially, e.g., pursuant to the user 102 flipping the check 108 over after an image of the front of the check 108 has been captured after passing the monitoring criteria. However, each side of the check 108 may be captured by the camera 207 using similar API calls. The images may be stored in an image file 315.


Once the images of one or both sides of the check 108 pass the monitoring criteria and are captured by the camera 207, the image file 315 may be operated on by the software object of the client 320. These operations may include any of the following: deskewing, dewarping, magnetic ink character recognition, cropping (either automatically, or having the user 102 manually identify the corners and/or edges of the check 108 for example), reducing the resolution of the image, number detection, character recognition, and the like.


With respect to number and character recognition, commercial check scanners have used characteristics of the MICR encoding to detect information about the check, such as the bank's routing number and the account number. However, the characteristic that these scanners have used is the magnetic property of the ink itself, and these scanners have used methods similar to those of magnetic audio tape readers. In an implementation, a software object of the client 320 may optically recognize the characters on the MICR line, as a consumer mobile device such as the mobile device 106 will lack the magnetic reading ability of a commercial check scanner.


The image may also be down-converted into a grayscale or black and white image, such as in either Joint Photographic Experts Group (JPEG) compliant format or tagged image file format (TIFF), for example. In an alternate example, the image may be formatted as a Scalable Vector Graphics (SVG) image. One of the benefits of an SVG file is a large size advantage over JPEG. In the former example, the image at some point before entry into the clearing system may be converted to TIFF format. This may be performed at the mobile device 106, wherein the camera 207 captures the image in TIFF format. However, the camera 207 of the mobile device 106 may capture the image in JPEG format, which may then be converted into TIFF either at the mobile device 106 or at the server 322. In the latter example, this may involve the transmission of the TIFF image across a communications network, which may be more advantageous as TIFF images are typically smaller in file size than a JPEG formatted image of the same size of picture.
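The grayscale down-conversion step can be sketched in pure Python (the helper name is hypothetical; the luminance weights are the commonly used Rec. 601 coefficients, an assumption here, as the source does not specify a conversion formula):

```python
def to_grayscale(rgb_image):
    """Down-convert an RGB image (rows of (r, g, b) tuples, each channel
    0-255) to a grayscale image (rows of 0-255 values) using the common
    Rec. 601 luminance weights."""
    return [[round(0.299 * r + 0.587 * g + 0.114 * b) for r, g, b in row]
            for row in rgb_image]
```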


The software object on the client 320 may operate by performing one or more of the operations described herein and then transmitting an image file 335 (e.g., based on image file 315 that has been processed) to the server 322 after the user 102 confirms that they do wish to deposit the check 108. Alternately, the software object may capture the image of the check 108 and transmit that image to the server 322 that in turn may perform those operations, verify that the image quality is within acceptable thresholds, and communicate that verification back to the client 320, which can then instruct the user 102 to take a picture of the other side of the check 108. In this example, the image transmitted to the server 322 may be in any format, such as JPEG or TIFF, insofar as the server software has the ability to convert that image into a Check 21 compliant format. Alternately, the bank may output an X9.37 file to the clearing system. The Check Clearing for the 21st Century Act (or Check 21 Act) is a United States federal law that allows the recipient of a paper check to create a digital version, thereby eliminating the need for further handling of the physical document. The Check 21 standard for electronic exchange is defined in the standard DSTU X9.37-2003 (“X9.37”). It is a binary interchange format.


The server 322 may confirm (e.g., using a process confirmation 340) with the user 102 the transmission, reception, and processing of each side of the check 108 separately, or may confirm both sides at the same time. On the server side, more operations may be performed, such as signature verification. Where to perform these operations may be determined by the processing power of the mobile device 106 itself, which is typically limited in computational power. However, the present discussion is not limited in any way by discussion of where certain operations are described as operating. The operations of detecting and verifying information may be performed by the client 320 before the information is transmitted along with the image in the image file 335 to the server 322. Alternately, the software object(s) operating on the mobile device 106 may perform no operation other than capturing images of the front and back of the check 108 after passing the monitoring criteria, receiving confirmation that the user 102 wishes to proceed, and transmitting those images to the server 322, wherein the server 322 performs those operations.


In an implementation, after the image file 335 has been received by the server 322, the server 322 may send a process confirmation 340 to the client 320. The process confirmation 340 may request instructions from the client 320 to continue proceeding with the deposit now that the server 322 has received the image file 335. In response, the client 320 may send a deposit confirmation 345 to the server 322, instructing the server 322 to process the deposit of the check based on the image file 335 that had been received by the server 322.



FIG. 7 shows a block diagram of a client apparatus 450 and a server apparatus 570 for the deposit of a check, in accordance with an example embodiment. The client apparatus 450 may include one or more software objects operating on a mobile device 106, such as described above. The client apparatus 450 may include a communications module 452, a check processing module 454, and an image monitoring and capture module 456. The client apparatus 450 may receive, in one example, one or more check images 458 as an input and output one or more processed images 460.


In an implementation, the check images 458 may be received following a software call from the check processing module 454 to the image monitoring and capture module 456. In such an implementation, the image monitoring and capture module 456 may include the camera 207 contained within the mobile device 106. Alternately, the camera 207 may be detachably coupled to the mobile device 106 such as through a secure digital (SD) slot or over any suitable communications bus, such as USB (universal serial bus).


In an implementation, the image monitoring and capture module 456 may obtain an image and send the image to a financial institution (e.g., financial institution 130, the server 322, the server apparatus 570, etc.) for processing. In an implementation, the client apparatus 450 may comprise a browser such as a web browser, for accessing a website on the Internet or other network associated with a financial institution. The user may access the website and select a “monitor and capture image” link or similar icon, button or link, for example, displayed on the browser. Such a selection may call the image monitoring and capture module 456 on the client apparatus 450.


The communications module 452 may be configured, in one example, to receive and send data signals over a suitable communications network. This may include, without limitation, GSM/GPRS, HSDPA, CDMA, TDMA, 802.11, 802.16 and the like. While the bandwidth available to the mobile device 106 may be an implementation concern, such discussion is outside the scope of the present discussion and any suitable wireless communications network is considered to be within the scope of the present discussion. With respect to the present discussion, the communications module 452 may receive one or more processed check images 460 from the check processing module 454 and may transmit them over the suitable communications network to the depository 204, as described herein.


The check processing module 454 may be configured, in one example, to cause the image monitoring and capture module 456 to monitor an image of at least one side of a check provided in a field of view of the camera 207 and then capture the image after it passes monitoring criteria. Compliance with the monitoring criteria is intended to ensure that the image of the check is suitable for one or more processing tasks. For instance, if the check is rotated 45 degrees clockwise when captured, the check processing module 454 or a software object operated on the server 322 described above may be unable to optically detect information on the check.


The check processing module 454 may perform one or more cleaning or processing operations on the captured image of the check. Such cleaning or processing may include dewarping and/or deskewing (if not part of the monitoring criteria, in an implementation), for example. Cleaning or processing may include down-converting the image received from the image monitoring and capture module 456 to a suitable size, such as 200 dots per inch (DPI) resolution or in a resolution range such as 200 DPI to 400 DPI, 300 DPI to 500 DPI, etc., and/or converting the image to grayscale or black and white. Such operation(s) may reduce the file size of the check image. Alternatively, the check processing module 454 may send instructions to the image monitoring and capture module 456 to cause the image monitoring and capture module 456 to capture an image of the check at a suitable resolution. The check processing module 454 may additionally perform any of the following operations, in further examples: convert from JPEG to TIFF, detect check information, perform signature detection on the image of the check, and the like. The check processing module 454 may, alternatively, send the captured check image to the server described herein for such processing, and receive confirmation that the operations were completed before further operations can proceed.
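The resolution down-conversion can be sketched as follows (a minimal sketch using nearest-neighbour sampling, an illustrative choice; the helper name is hypothetical and a production implementation would more likely use an averaging or interpolating resampler):

```python
def downsample(image, factor):
    """Reduce the resolution of a 2-D image (list of rows) by an integer
    factor using nearest-neighbour sampling, e.g., 400 DPI -> 200 DPI with
    factor=2. Keeps every factor-th row and every factor-th column."""
    return [row[::factor] for row in image[::factor]]
```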


The size of the file sent between the mobile device and the server may be small. This runs counter to the needs of automatic check detection against a background: if the image is captured in color, distinguishing the check from the background is easier. However, the processed image sent over the communications network may need to be smaller, and if the detection operation is performed by the server, it may be advantageous to convert the captured image to grayscale, or even black and white, before transmission to the server. Grayscale images are compliant with the Check 21 Act.


While “flat” is a fairly well-known term to users, each user's notion of what is flat relative to the camera lens of the camera 207 associated with the mobile device 106 may differ, making it necessary to align the check image programmatically or risk rejecting a large number of check images. As the image captured is a set of pixels, a tilted image will result in a jagged polygon rather than a perfect rectangle. Using convex hull algorithms, the check processing modules may create a smooth polygon around the boundary and remove the concavity of the check image. Alternatively, a rotating calipers algorithm may be used to determine the tightest fitting rectangle around the check boundary, which can then be used to determine its angle, with that angle being used to align the check properly.
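The angle-determination step can be illustrated with a simplified stand-in for the rotating calipers result (the helper name is hypothetical): once two adjacent corners of the fitted rectangle are known, the tilt of that edge relative to the horizontal gives the rotation needed to level the check. This sketch assumes a conventional y-up coordinate system; image coordinates with y pointing down would flip the sign.

```python
import math

def skew_angle_degrees(corner_a, corner_b):
    """Angle, in degrees, of the edge from corner_a to corner_b relative to
    the horizontal. Each corner is an (x, y) point; the result can be used
    to rotate the check image back to level."""
    (x1, y1), (x2, y2) = corner_a, corner_b
    return math.degrees(math.atan2(y2 - y1, x2 - x1))
```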


The server apparatus 570 may include one or more software objects operating on a server operated by the depository 204. Aspects of an example server apparatus are described with respect to FIG. 10. The server apparatus 570 may include a communications module 572, a check processing module 574, and a check clearance module 576. The server apparatus 570 may receive one or more processed images 460 from a mobile device 106 or a client apparatus 450 as an input and may output a file such as a Check 21 compliant file 578. The Check 21 compliant file 578 may be a file or entry in a record set that is compliant with the clearinghouse rules set forth in the Check 21 Act and may include outputting an X9.37 file, in one example.


The communications module 572 may be configured to receive a wireless communication from the mobile device 106 over any suitable communications network, such as those described above. The communications module 572 may additionally receive a communication over a different communications network than the mobile device 106 communicated on, such as receiving the communication over a TCP/IP (Transmission Control Protocol/Internet Protocol) connection from the user's communication provider.


The check processing module 574 may be configured, in one example, to perform one or more check processing operations on the processed image(s) 460 that are received. In an implementation, these operations may include any of the operations described herein with respect to the check processing module 454. The operation of signature verification may be performed by the check processing module 574 of the server apparatus 570 as the server apparatus 570 may interface with other systems of the depository 204 that may maintain previously verified signature samples of the user 102. Performing signature verification at the client apparatus 450 may be computationally unfeasible; additionally, there may be a security risk if the signature sample is stored on the user's own device.


A cropped grayscale image may be sent to the server apparatus 570. The server apparatus 570 may extract information via a TIFF conversion and determine the DPI and re-scale to the proper DPI (e.g., convert to TIFF and detect the DPI that was used in the grayscale image). In an implementation, DPI detection may run on the client apparatus 450.
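One way the DPI detection could work is sketched below (an assumption for illustration, not the source's stated method; helper names and the 6-inch figure are hypothetical): since a personal check has a known physical width, the DPI of a cropped check image can be estimated from its pixel width, and a rescale factor toward the target DPI follows directly.

```python
def estimate_dpi(pixel_width, check_width_inches=6.0):
    """Estimate the DPI of a cropped check image from its pixel width and
    an assumed physical check width (a personal check is roughly 6 inches)."""
    return pixel_width / check_width_inches

def rescale_factor(current_dpi, target_dpi=200):
    """Scale factor to apply to the image to reach the target DPI."""
    return target_dpi / current_dpi
```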


The check clearance module 576 may be configured, in one example, to receive a file from the check processing module 574 and may communicate with a check clearinghouse such that a Check 21 compliant file may be delivered to the check clearinghouse and funds may be received by the depository 204. The availability of the funds to the user 102 may be delayed by this operation such that the user 102 only has access to those funds when the depository 204 receives confirmation that the check has cleared.



FIG. 8 is an operational flow of an implementation of a method 800 that may be used for deposit of a check using image monitoring of the check. At 810, a request for access may be received from a user (e.g., the user 102). The user may request access to a deposit system operated by a depository (e.g., the depository 204) by way of a mobile device (e.g., the mobile device 106) such as a cellular phone, a PDA, a handheld computing device, etc. operated by the user. The access may be through some sort of user login, in some examples. The deposit system may be configured to receive a deposit of a negotiable instrument, such as a check, money order, cashier's check, etc. from the user and clear the negotiable instrument in a suitable clearinghouse system.


At 820, the system may initialize a software object on the mobile device. This may include sending instructions to the mobile device intended to execute a previously installed (i.e., pre-installed) software object. Alternatively, the system may send a software object to the mobile device that may execute the software object, carry out operations described herein by use of the software object, and terminate the software object. In an implementation, the system may instruct a camera associated with the mobile device to monitor and capture an image of the negotiable instrument in conjunction with monitoring criteria.


The user may use the camera to obtain an image in the field of view of the camera, and at 830, the image in the field of view of the camera may be monitored with respect to one or more monitoring criteria, such as those described above. The monitoring may be performed by the camera, the mobile device, and/or a computing device associated with the depository, for example. The monitoring may be performed pursuant to instructions received at the camera or mobile device from the deposit system operated by a depository, the server 322, or the server apparatus 570, for example. In an implementation, the results of the monitoring may indicate that the camera and/or the check should be repositioned and/or the light source should be adjusted prior to an image capture in order to capture an image of the check that may be processed properly, e.g., to have the data from the check obtained without error from the image, so that the check can be cleared.


At 840, feedback based on the results may be generated and provided visually and/or aurally to the user via the camera and/or the mobile device. In an implementation, the feedback may be provided if the image fails to pass the monitoring criteria. The feedback may comprise instructions or guidance for the user to follow to obtain an image of the check in the field of view of the camera that will pass the monitoring criteria. Processing may continue at 830 with the image that is currently in the field of view of the camera (after the user has received and acted on the feedback) being monitored with respect to the monitoring criteria.


When the image in the field of view passes the monitoring criteria as determined at 830, the image in the field of view may be captured by the camera at 850. This may be accomplished through the software object accessing a camera associated with the mobile device (e.g., either comprised within the mobile device or separate from the mobile device). This may be done through an API exposed by the OS of the mobile device, or may be through software code customized for a specific phone and specific camera. With respect to the former, a developer of the software object may write code to the camera API(s), which may be specific to the OS and without regard to the camera on the device. The user may initiate the capture of the image (e.g., by pressing a button on the camera or the mobile device) or the image may be captured automatically, without user intervention, as soon as the image in the field of view is determined to have passed the monitoring criteria. In this manner, the occurrence of non-conforming images downstream (e.g., at a depository or financial institution) is reduced, and there is a high confidence that the image will be properly processed downstream.
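The monitor-feedback-capture flow at 830 through 850 can be sketched as a loop (all callback names are hypothetical): each frame from the field of view is checked against the monitoring criteria, feedback is issued when a frame fails, and capture fires automatically on the first passing frame.

```python
def monitor_and_capture(get_frame, passes_criteria, give_feedback, capture,
                        max_tries=100):
    """Monitor frames from the camera's field of view. passes_criteria(frame)
    returns (ok, reasons); on the first passing frame, capture it and return
    the result; otherwise report the failure reasons as feedback and retry."""
    for _ in range(max_tries):
        frame = get_frame()
        ok, reasons = passes_criteria(frame)
        if ok:
            return capture(frame)
        give_feedback(reasons)
    return None  # gave up; the user may need to try again
```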


In an implementation, when the image in the field of view is determined to pass the monitoring criteria, feedback may be generated and provided to the user indicating so. The feedback may instruct the user to capture the image now (e.g., by pressing a button on the camera or mobile device) or may advise the user that the image has been captured, for example.


At 860, the captured image may be transmitted to the depository, e.g., as a digital image file. At 870, the depository may receive the image of the check (along with financial information pertaining to the account for depositing funds, for example) and may process the image. Processing of the digital image file may include retrieving financial information regarding the check. The financial information may comprise the MICR number, the routing number, an amount, etc. Any known image processing technology may be used, such as edge detection, filtering to remove imagery except the check image or check data in the received digital image file, image sharpening, and technologies to distinguish between the front and the back sides of the check. The depository may identify and/or remove at least a portion of data that is extraneous to the check, such as background data.


After retrieving the financial information from the check in an electronic data representation form, the depository may determine whether the financial information such as the amount payable to the user, the account associated with the user to deposit funds, an account associated with a payor to debit funds, and an institution associated with the payor, etc., may be valid. For example, the depository may include electronic devices such as computers, servers, databases, or the like that may be in communication with each other. The electronic devices may receive an electronic data representation and may perform an analysis on the quality of the data representation, the readability of the data representation, or the like. For example, the electronic devices may determine whether the account number, amount payable, or the like may be readable such that they may be parsed and processed by the depository to credit an account associated with the user.


If the financial information is determined to be valid, the electronic data representation may be processed by the depository, thereby depositing the money in the user's account. If the financial information is determined to be invalid, then the user may be advised. For example, the depository may transmit an email, a web message, an instant message, or the like to the user indicating that the financial information associated with the electronic data representation may be invalid. The user may determine how to proceed by selecting an option on the web message, replying to the email, or the like.


Thus, in an implementation, instructions on how the user would like to proceed may be requested from the user, such as whether the user would like to try the deposit again (e.g., make another image of the check that passes the monitoring criteria and send it to the depository) or whether the user would like assistance from a representative, for example. The user may indicate how they would like to proceed. If the user would like assistance, the financial information may be transferred to a representative for further review. The representative may review the financial information associated with the electronic data representation to determine whether to allow the electronic data representation to be processed by the depository. If so, the electronic data representation of the financial information may be processed by the depository, thereby depositing the check in the user's account. The depository may send a notice to the user via email, facsimile, instant message, or mail, for example, that the check has been deposited into the selected account.



FIG. 9 is an operational flow of another implementation of a method 900 that may be used for deposit of a check using image monitoring of the check. A user (e.g., the user 102) may receive and endorse a check (e.g., the check 108) at 910 and open a communication pathway with an institution (e.g., the financial institution 130) at 920. In an implementation, the user may open a communication pathway with the institution by logging into a website of the institution, for example. There may be several ways in which a communication pathway may be established, including, but not limited to, an Internet connection via a website of the institution. The user may access the website and log into the website using credentials, such as, but not limited to, a username and a password.


At 930, the user may send a request to deposit the check and may select an account in which to deposit the check. In an implementation, the user may select a “deposit check” option provided on the website, and may enter details such as check amount, date, the account the check funds should be deposited in, comments, etc.


At 940, an image in the field of view of the camera may be obtained and provided, via the communication pathway, to the institution. A still image may be provided or a video may be provided, such as a video stream generated by the camera.


At 950, the institution may receive the image or video stream and may analyze the image or a frame of the video stream with respect to one or more monitoring criteria, such as those described above. Feedback pertaining to the image with respect to the monitoring criteria may be generated and provided to the user over the communication pathway. Based on the feedback, the user may adjust the position of the camera and/or the check and/or may adjust the light source until the image in the field of view of the camera is determined by the institution to pass the monitoring criteria.


When the image in the field of view passes the monitoring criteria, the image in the field of view may be captured (e.g., automatically without user intervention or pursuant to the user pressing a button) by the camera at 960, thereby creating a digital image of the check. In an implementation, the user may instruct the camera (e.g., by pressing a button on the camera or the mobile device) to create the digital image. In another implementation, the camera may automatically create the digital image as soon as the image of the check passes the monitoring criteria. In this manner, the user may point the camera at the check such that the image of the check appears in the field of view, and after the image has been determined to pass the monitoring criteria, a digital image of the check may be created without further user intervention. Depending on the implementation, one or more digital images of the check (e.g., corresponding to the front and back of the check) may be created using such techniques.


At 970, the digital image(s) may be uploaded to the institution using any known image upload process. In an implementation, the upload may be augmented by secondary data, which may be information relating to the deposit of the check, such as an account number and a deposit amount, for example. At 980, when the institution has received the digital images (e.g., of the front and back sides of the check), the institution may process the digital images to obtain an image of the check and to deposit the funds of the check in the user's account, as described herein. It is contemplated that processing such as grayscale conversion, image cropping, image compression, edge and/or corner detection, etc. may be implemented in the method 900. Such operations may be performed on one or more digital images created by the camera and may be performed on the image(s) by the mobile device and/or by the institution, as described further above.
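Of the processing operations contemplated above, grayscale conversion is the simplest to sketch. The ITU-R BT.601 luminance weights used here are a common convention and an assumption on the editor's part; the disclosure does not mandate any particular formula:

```python
def to_grayscale(rgb_rows):
    """Convert a 2-D image of (R, G, B) tuples to 8-bit luminance values.

    Illustrative sketch of grayscale conversion using the BT.601 luma
    weights (0.299, 0.587, 0.114); the result is clamped to the 0-255
    range of an 8-bit grayscale image.
    """
    return [
        [min(255, round(0.299 * r + 0.587 * g + 0.114 * b))
         for (r, g, b) in row]
        for row in rgb_rows
    ]
```

Either the mobile device or the institution could apply such a step before cropping, compression, or edge and corner detection, consistent with the division of labor described above.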


Although the examples described herein may refer to uploading of images of checks to an institution, it is contemplated that any negotiable instrument or image (e.g., vehicle accident pictures provided to an insurance company) may be processed and/or transmitted using the techniques described herein. Additionally, one or more of the techniques described herein may be performed by the institution instead of the mobile device of the user.



FIG. 10 is a block diagram of an example computing environment in which example embodiments and aspects may be implemented. The computing system environment is only one example of a suitable computing environment and is not intended to suggest any limitation as to the scope of use or functionality. Numerous other general purpose or special purpose computing system environments or configurations may be used. Examples of well-known computing systems, environments, and/or configurations that may be suitable for use include, but are not limited to, personal computers (PCs), server computers, handheld or laptop devices, multiprocessor systems, microprocessor-based systems, network PCs, minicomputers, mainframe computers, embedded systems, distributed computing environments that include any of the above systems or devices, and the like.


Computer-executable instructions, such as program modules, executed by a computer may be used. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. Distributed computing environments may be used where tasks are performed by remote processing devices that are linked through a communications network or other data transmission medium. In a distributed computing environment, program modules and other data may be located in both local and remote computer storage media including memory storage devices.


With reference to FIG. 10, a system 1000 includes a computer 1010 connected to a network 1014. The computer 1010 includes a processor 1020, a storage device 1022, an output device 1024, an input device 1026, and a network interface device 1028, all connected via a bus 1030. The processor 1020 represents a central processing unit of any type of architecture, such as a CISC (Complex Instruction Set Computing), RISC (Reduced Instruction Set Computing), VLIW (Very Long Instruction Word), or a hybrid architecture, although any appropriate processor may be used. The processor 1020 executes instructions and includes that portion of the computer 1010 that controls the operation of the entire computer. Although not depicted in FIG. 10, the processor 1020 typically includes a control unit that organizes data and program storage in memory and transfers data and other information between the various parts of the computer 1010. The processor 1020 receives input data from the input device 1026 and the network 1014, reads and stores code and data in the storage device 1022, and presents data to the output device 1024. Although the computer 1010 is shown to contain only a single processor 1020 and a single bus 1030, the disclosed embodiment applies equally to computers that may have multiple processors and to computers that may have multiple busses with some or all performing different functions in different ways.


The storage device 1022 represents one or more mechanisms for storing data. For example, the storage device 1022 may include read-only memory (ROM), RAM, magnetic disk storage media, optical storage media, flash memory devices, and/or other machine-readable media. In other embodiments, any appropriate type of storage device may be used. Although only one storage device 1022 is shown, multiple storage devices and multiple types of storage devices may be present. Further, although the computer 1010 is drawn to contain the storage device 1022, the storage device 1022 may be distributed across other computers, for example on a server.


The storage device 1022 includes a controller (not shown in FIG. 10) and data items 1034. The controller includes instructions capable of being executed on the processor 1020 to carry out functions previously described herein with reference to FIGS. 1-9. In another embodiment, some or all of the functions are carried out via hardware in lieu of a processor-based system. In one embodiment, the controller is a web browser, but in other embodiments the controller may be a database system, a file system, an electronic mail system, a media manager, an image manager, or may include any other functions capable of accessing data items. The storage device 1022 may also contain additional software and data (not shown), which is not necessary to understand the invention. Although the controller and the data items 1034 are shown to be within the storage device 1022 in the computer 1010, some or all of them may be distributed across other systems, for example on a server and accessed via the network 1014.


The output device 1024 is that part of the computer 1010 that displays output to the user. The output device 1024 may be a liquid crystal display (LCD) well-known in the art of computer hardware. In other embodiments, the output device 1024 may be replaced with a gas or plasma-based flat-panel display or a traditional cathode-ray tube (CRT) display. In still other embodiments, any appropriate display device may be used. Although only one output device 1024 is shown, in other embodiments any number of output devices of different types, or of the same type, may be present. In an embodiment, the output device 1024 displays a user interface. The input device 1026 may be a keyboard, mouse or other pointing device, trackball, touchpad, touch screen, keypad, microphone, voice recognition device, or any other appropriate mechanism for the user to input data to the computer 1010 and manipulate the user interface previously discussed. Although only one input device 1026 is shown, in another embodiment any number and type of input devices may be present.


The network interface device 1028 provides connectivity from the computer 1010 to the network 1014 through any suitable communications protocol. The network interface device 1028 sends and receives data items from the network 1014. The bus 1030 may represent one or more busses, e.g., USB, PCI, ISA (Industry Standard Architecture), X-Bus, EISA (Extended Industry Standard Architecture), or any other appropriate bus and/or bridge (also called a bus controller).


The computer 1010 may be implemented using any suitable hardware and/or software, such as a personal computer or other electronic computing device. Portable computers, laptop or notebook computers, PDAs, pocket computers, appliances, telephones, and mainframe computers are examples of other possible configurations of the computer 1010. For example, other peripheral devices such as audio adapters or chip programming devices, such as EPROM (Erasable Programmable Read-Only Memory) programming devices may be used in addition to, or in place of, the hardware already depicted.


The network 1014 may be any suitable network and may support any appropriate protocol suitable for communication to the computer 1010. In an embodiment, the network 1014 may support wireless communications. In another embodiment, the network 1014 may support hard-wired communications, such as a telephone line or cable. In another embodiment, the network 1014 may support the Ethernet IEEE (Institute of Electrical and Electronics Engineers) 802.3x specification. In another embodiment, the network 1014 may be the Internet and may support IP (Internet Protocol). In another embodiment the network 1014 may be a LAN or a WAN. In another embodiment, the network 1014 may be a hotspot service provider network. In another embodiment, the network 1014 may be an intranet. In another embodiment, the network 1014 may be a GPRS (General Packet Radio Service) network. In another embodiment, the network 1014 may be any appropriate cellular data network or cell-based radio network technology. In another embodiment, the network 1014 may be an IEEE 802.11 wireless network. In still another embodiment, the network 1014 may be any suitable network or combination of networks. Although one network 1014 is shown, in other embodiments any number of networks (of the same or different types) may be present.


It should be understood that the various techniques described herein may be implemented in connection with hardware or software or, where appropriate, with a combination of both. Thus, the methods and apparatus of the presently disclosed subject matter, or certain aspects or portions thereof, may take the form of program code (i.e., instructions) embodied in tangible media, such as floppy diskettes, CD-ROMs, hard drives, or any other machine-readable storage medium wherein, when the program code is loaded into and executed by a machine, such as a computer, the machine becomes an apparatus for practicing the presently disclosed subject matter. In the case of program code execution on programmable computers, the computing device generally includes a processor, a storage medium readable by the processor (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device. One or more programs may implement or use the processes described in connection with the presently disclosed subject matter, e.g., through the use of an API, reusable controls, or the like. Such programs may be implemented in a high level procedural or object-oriented programming language to communicate with a computer system. However, the program(s) can be implemented in assembly or machine language, if desired. In any case, the language may be a compiled or interpreted language and it may be combined with hardware implementations.


Although exemplary embodiments may refer to using aspects of the presently disclosed subject matter in the context of one or more stand-alone computer systems, the subject matter is not so limited, but rather may be implemented in connection with any computing environment, such as a network or distributed computing environment. Still further, aspects of the presently disclosed subject matter may be implemented in or across a plurality of processing chips or devices, and storage may similarly be effected across a plurality of devices. Such devices might include personal computers, network servers, and handheld devices, for example. Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.

Claims
  • 1. A non-transitory computer-readable medium comprising computer-readable instructions for depositing a check that, when executed by a processor, cause the processor to: display an image of the check in a field of view of a camera of a mobile device; assess an image criterion of the displayed image with an image assessment software object stored on the mobile device; automatically capture the image of the check with the camera, without user intervention, when the image of the check passes the image criterion; and provide a digital file of the captured image of the check from the camera to a depository via a communication pathway between the mobile device and the depository.
  • 2. The non-transitory computer-readable medium of claim 1, further comprising instructions that provide feedback, via the mobile device to a user of the mobile device, regarding the image of the check with respect to the image criterion prior to capturing the image of the check.
  • 3. The non-transitory computer-readable medium of claim 2, wherein the feedback is provided if the image fails to pass the image criterion.
  • 4. The non-transitory computer-readable medium of claim 3, wherein the feedback comprises instructions for the user to follow to modify the displayed image of the check and obtain a second image of the check in the field of view of the camera that passes the image criterion.
  • 5. The non-transitory computer-readable medium of claim 4, wherein the feedback is displayed visually in the field of view of the camera.
  • 6. The non-transitory computer-readable medium of claim 4, wherein the feedback comprises alphanumeric instructions.
  • 7. The non-transitory computer-readable medium of claim 4, wherein the feedback comprises a non-alphanumeric visual indicator.
  • 8. The non-transitory computer-readable medium of claim 1, where the computer-readable instructions cause the processor to assess the image of the check in the field of view of the camera with respect to an image criterion without providing feedback, via the mobile device, regarding the assessment of the image of the check in the field of view of the camera.
  • 9. A non-transitory computer-readable medium comprising computer-readable instructions for depositing a check that, when executed by a processor, cause the processor to: initialize a first software object on a mobile device operated by a user, the first software object configured to communicate with a camera; display on a display of the mobile device an image of a check in a field of view of the camera; assess the image of the check in a field of view of the camera with respect to an image criterion using a second software object associated with the mobile device; automatically capture and store the image of the check in a digital image file using the camera when the image of the check in the field of view passes the image criterion; and transmit the image of the check from the mobile device to a deposit system configured to clear the check and deposit funds of the check into a deposit account of the user.
  • 10. The non-transitory computer-readable medium of claim 9, further comprising instructions that provide feedback, via the mobile device to the user, prior to storing the image of the check in the digital image file.
  • 11. The non-transitory computer-readable medium of claim 10, wherein the feedback comprises instructions to the user to adjust a position of the check relative to the camera.
  • 12. The non-transitory computer-readable medium of claim 9, wherein the image criterion comprises light contrast or light brightness of the image.
  • 13. The non-transitory computer-readable medium of claim 9, wherein the image criterion comprises skewing of the image or warping of the image.
  • 14. A server for depositing a check, comprising: a network interface configured to receive a request for access to a deposit system from a mobile device; and a processor configured to: receive, from the mobile device via the network interface, an image of the check in a field of view of a camera associated with the mobile device; assess the image with respect to an image criterion; transmit instructions from the server to the mobile device to automatically create a digital image of the check when the image passes the image criterion; receive the digital image at the server from the mobile device; process the digital image at the server; and deposit funds of the check into an account associated with the deposit system using the processed digital image.
  • 15. The server of claim 14, wherein the image criterion is based on light in the image.
  • 16. The server of claim 15, wherein the processor is configured to assess the image with respect to the image criterion by generating a histogram using the image and using the histogram to determine when the image passes the image criterion.
  • 17. The server of claim 14, wherein the processor of the server is further configured to provide feedback to the mobile device regarding the image with respect to the image criterion prior to the image passing the image criterion.
  • 18. The server of claim 14, wherein the processor is configured to, after receiving the digital image, transmit a query to the mobile device requesting instructions to proceed with depositing funds of the check.
  • 19. The server of claim 14, wherein the mobile device comprises a video source and the image of the check is received at the network interface in a video received from the mobile device.
  • 20. The server of claim 14, wherein the image of the check is a frame of a live video of the check.
CROSS REFERENCE TO RELATED APPLICATION(S)

This application is a continuation of U.S. patent application Ser. No. 16/712,182, filed Dec. 12, 2019, which is a continuation of U.S. patent application Ser. No. 16/280,455, filed on Feb. 20, 2019, which is a continuation of U.S. patent application Ser. No. 15/792,966, filed on Oct. 25, 2017 (issued as U.S. Pat. No. 10,235,660 on Mar. 19, 2019), which is a continuation of U.S. patent application Ser. No. 15/392,950, filed on Dec. 28, 2016 (issued as U.S. Pat. No. 9,818,090 on Nov. 14, 2017), which is a continuation of U.S. patent application Ser. No. 13/922,686, filed Jun. 20, 2013 (issued as U.S. Pat. No. 9,569,756 on Feb. 14, 2017), which is a continuation of U.S. patent application Ser. No. 12/545,127, filed Aug. 21, 2009 (issued as U.S. Pat. No. 8,977,571 on Mar. 10, 2015), all of which are hereby incorporated by reference herein in their entirety.

US Referenced Citations (1041)
Number Name Date Kind
1748489 McCarthy et al. Feb 1930 A
2292825 Dilks et al. Aug 1942 A
3005282 Christiansen Oct 1961 A
3341820 Grillmeier, Jr. et al. Sep 1967 A
3576972 Wood May 1971 A
3593913 Bremer Jul 1971 A
3620553 Donovan Nov 1971 A
3648242 Grosbard Mar 1972 A
3800124 Walsh Mar 1974 A
3816943 Henry Jun 1974 A
4002356 Weidmann Jan 1977 A
4027142 Paup et al. May 1977 A
4060711 Buros Nov 1977 A
4070649 Wright, Jr. et al. Jan 1978 A
4128202 Buros Dec 1978 A
4136471 Austin Jan 1979 A
4205780 Burns Jun 1980 A
4264808 Owens Apr 1981 A
4305216 Skelton Dec 1981 A
4321672 Braun Mar 1982 A
4346442 Musmanno Aug 1982 A
4417136 Rushby et al. Nov 1983 A
4433436 Carnes Feb 1984 A
4454610 Sziklai Jun 1984 A
RE31692 Tyburski et al. Oct 1984 E
4523330 Cain Jun 1985 A
4636099 Goldston Jan 1987 A
4640413 Kaplan Feb 1987 A
4644144 Chandek Feb 1987 A
4722444 Murphy et al. Feb 1988 A
4722544 Weber Feb 1988 A
4727435 Otani et al. Feb 1988 A
4737911 Freeman Apr 1988 A
4774663 Musmanno Sep 1988 A
4790475 Griffin Dec 1988 A
4806780 Yamamoto Feb 1989 A
4837693 Schotz Jun 1989 A
4890228 Longfield Dec 1989 A
4896363 Taylor et al. Jan 1990 A
4927071 Wood May 1990 A
4934587 McNabb Jun 1990 A
4960981 Benton Oct 1990 A
4975735 Bright Dec 1990 A
5022683 Barbour Jun 1991 A
5053607 Carlson Oct 1991 A
5077805 Tan Dec 1991 A
5091968 Higgins et al. Feb 1992 A
5122950 Benton et al. Jun 1992 A
5134564 Dunn et al. Jul 1992 A
5146606 Grondalski Sep 1992 A
5157620 Shaar Oct 1992 A
5159548 Caslavka Oct 1992 A
5164833 Aoki Nov 1992 A
5175682 Higashiyama et al. Dec 1992 A
5187750 Behera Feb 1993 A
5191525 LeBrun Mar 1993 A
5193121 Elischer et al. Mar 1993 A
5220501 Lawlor Jun 1993 A
5227863 Bilbrey et al. Jul 1993 A
5229589 Schneider Jul 1993 A
5233547 Kapp et al. Aug 1993 A
5237158 Kern et al. Aug 1993 A
5237159 Stephens Aug 1993 A
5237620 Deaton et al. Aug 1993 A
5257320 Etherington et al. Oct 1993 A
5265008 Benton Nov 1993 A
5268968 Yoshida Dec 1993 A
5283829 Anderson Feb 1994 A
5321816 Rogan Jun 1994 A
5345090 Hludzinski Sep 1994 A
5347302 Simonoff Sep 1994 A
5350906 Brody Sep 1994 A
5373550 Campbell Dec 1994 A
5383113 Kight et al. Jan 1995 A
5419588 Wood May 1995 A
5422467 Graef Jun 1995 A
5444616 Nair et al. Aug 1995 A
5444794 Uhland, Sr. Aug 1995 A
5455875 Chevion et al. Oct 1995 A
5475403 Havlovick et al. Dec 1995 A
5504538 Tsujihara Apr 1996 A
5504677 Pollin Apr 1996 A
5528387 Kelly et al. Jun 1996 A
5530773 Thompson Jun 1996 A
5577179 Blank Nov 1996 A
5583759 Geer Dec 1996 A
5590196 Moreau Dec 1996 A
5594225 Botvin Jan 1997 A
5598969 Ong Feb 1997 A
5602936 Green Feb 1997 A
5610726 Nonoshita Mar 1997 A
5611028 Shibasaki Mar 1997 A
5630073 Nolan May 1997 A
5631984 Graf et al. May 1997 A
5668897 Stolfo Sep 1997 A
5673320 Ray et al. Sep 1997 A
5677955 Doggett Oct 1997 A
5678046 Cahill et al. Oct 1997 A
5679938 Templeton Oct 1997 A
5680611 Rail Oct 1997 A
5691524 Josephson Nov 1997 A
5699452 Vaidyanathan Dec 1997 A
5734747 Vaidyanathan Mar 1998 A
5737440 Kunkler Apr 1998 A
5748780 Stolfo May 1998 A
5751842 Riach May 1998 A
5784503 Bleecker, III et al. Jul 1998 A
5830609 Warner Nov 1998 A
5832463 Funk Nov 1998 A
5838814 Moore Nov 1998 A
5859935 Johnson et al. Jan 1999 A
5863075 Rich Jan 1999 A
5870456 Rogers Feb 1999 A
5870724 Lawlor Feb 1999 A
5870725 Bellinger et al. Feb 1999 A
5878337 Joao Mar 1999 A
5893101 Balogh et al. Apr 1999 A
5897625 Gustin Apr 1999 A
5898157 Mangili et al. Apr 1999 A
5901253 Tretter May 1999 A
5903878 Talati May 1999 A
5903881 Schrader May 1999 A
5903904 Peairs May 1999 A
5910988 Ballard Jun 1999 A
5917931 Kunkier Jun 1999 A
5924737 Schrupp Jul 1999 A
5926548 Okamoto Jul 1999 A
5930501 Neil Jul 1999 A
5930778 Geer Jul 1999 A
5937396 Konya Aug 1999 A
5940844 Cahill Aug 1999 A
5982918 Mennie Nov 1999 A
5987439 Gustin et al. Nov 1999 A
6005623 Takahashi Dec 1999 A
6012048 Gustin et al. Jan 2000 A
6014454 Kunkler Jan 2000 A
6021202 Anderson Feb 2000 A
6021397 Jones Feb 2000 A
6023705 Bellinger et al. Feb 2000 A
6029887 Furuhashi Feb 2000 A
6030000 Diamond Feb 2000 A
6032137 Ballard Feb 2000 A
6038553 Hyde Mar 2000 A
6044883 Noyes Apr 2000 A
6053405 Irwin, Jr. et al. Apr 2000 A
6059185 Funk et al. May 2000 A
6064753 Bolle et al. May 2000 A
6072941 Suzuki et al. Jun 2000 A
6073119 Borenmisza-Wahr Jun 2000 A
6085168 Mori Jul 2000 A
6097834 Krouse Aug 2000 A
6097845 Ng et al. Aug 2000 A
6097885 Rayner Aug 2000 A
6105865 Hardesty Aug 2000 A
6128603 Dent et al. Oct 2000 A
6141339 Kaplan et al. Oct 2000 A
6145738 Stinson et al. Nov 2000 A
6148102 Stolin Nov 2000 A
6149056 Stinson et al. Nov 2000 A
6151409 Chen et al. Nov 2000 A
6151423 Melen Nov 2000 A
6151426 Lee Nov 2000 A
6159585 Rittenhouse Dec 2000 A
6170744 Lee Jan 2001 B1
6178270 Taylor et al. Jan 2001 B1
6181837 Cahill et al. Jan 2001 B1
6188506 Kaiserman Feb 2001 B1
6189785 Lowery Feb 2001 B1
6192165 Irons Feb 2001 B1
6195452 Royer Feb 2001 B1
6195694 Chen et al. Feb 2001 B1
6199055 Kara Mar 2001 B1
6236009 Emigh et al. May 2001 B1
6243689 Norton Jun 2001 B1
6278983 Ball Aug 2001 B1
6282826 Richards Sep 2001 B1
6293469 Masson et al. Sep 2001 B1
6304860 Martin Oct 2001 B1
6310647 Parulski et al. Oct 2001 B1
6314452 Dekel Nov 2001 B1
6315195 Ramachandran Nov 2001 B1
6317727 May Nov 2001 B1
6328207 Gregoire et al. Dec 2001 B1
6330546 Gopinathan et al. Dec 2001 B1
6339658 Moccagatta Jan 2002 B1
6339766 Gephart Jan 2002 B1
6351553 Hayosh Feb 2002 B1
6351735 Deaton et al. Feb 2002 B1
6354490 Weiss et al. Mar 2002 B1
6363164 Jones et al. Mar 2002 B1
6390362 Martin May 2002 B1
6397196 Kravetz May 2002 B1
6408084 Foley Jun 2002 B1
6411725 Rhoads Jun 2002 B1
6411737 Wesolkowski et al. Jun 2002 B2
6411938 Gates et al. Jun 2002 B1
6413305 Mehta Jul 2002 B1
6417869 Do Jul 2002 B1
6425017 Dievendorff Jul 2002 B1
6429952 Olbricht Aug 2002 B1
6439454 Masson et al. Aug 2002 B1
6449397 Che-chu Sep 2002 B1
6450403 Martens et al. Sep 2002 B1
6463220 Dance Oct 2002 B1
6464134 Page Oct 2002 B1
6469745 Yamada et al. Oct 2002 B1
6470325 Leemhuis Oct 2002 B1
6473519 Pidhirny et al. Oct 2002 B1
6502747 Stoutenburg et al. Jan 2003 B1
6505178 Flenley Jan 2003 B1
6546119 Ciolli et al. Apr 2003 B2
6574377 Cahill et al. Jun 2003 B1
6574609 Downs Jun 2003 B1
6578760 Otto Jun 2003 B1
6587837 Spagna Jul 2003 B1
6606117 Windle Aug 2003 B1
6609200 Anderson Aug 2003 B2
6611598 Hayosh Aug 2003 B1
6614930 Agnihotri et al. Sep 2003 B1
6643416 Daniels Nov 2003 B1
6647136 Jones et al. Nov 2003 B2
6654487 Downs, Jr. Nov 2003 B1
6661910 Jones et al. Dec 2003 B2
6668372 Wu Dec 2003 B1
6672452 Alves Jan 2004 B1
6682452 Quintus Jan 2004 B2
6695204 Stinson Feb 2004 B1
6697091 Rzepkowski et al. Feb 2004 B1
6711474 Treyz et al. Mar 2004 B1
6726097 Graef Apr 2004 B2
6728397 Mcneal Apr 2004 B2
6738496 Van Hall May 2004 B1
6742128 Joiner May 2004 B1
6745186 Testa et al. Jun 2004 B1
6754640 Bozeman Jun 2004 B2
6755340 Voss Jun 2004 B1
6760414 Schurko et al. Jul 2004 B1
6760470 Bogosian et al. Jul 2004 B1
6763226 McZeal Jul 2004 B1
6781962 Williams Aug 2004 B1
6786398 Stinson et al. Sep 2004 B1
6789054 Makhlouf Sep 2004 B1
6796491 Nakajima Sep 2004 B2
6806903 Okisu et al. Oct 2004 B1
6807294 Yamazaki Oct 2004 B2
6813733 Li Nov 2004 B1
6829704 Zhang Dec 2004 B2
6844885 Anderson Jan 2005 B2
6856965 Stinson Feb 2005 B1
6863214 Garner et al. Mar 2005 B2
6870947 Kelland Mar 2005 B2
6873728 Bernstein et al. Mar 2005 B2
6883140 Acker Apr 2005 B1
6898314 Kung et al. May 2005 B2
6902105 Koakutsu Jun 2005 B2
6913188 Wong Jul 2005 B2
6922487 Dance et al. Jul 2005 B2
6931255 Mekuria Aug 2005 B2
6931591 Brown Aug 2005 B1
6934719 Nally Aug 2005 B2
6947610 Sun Sep 2005 B2
6957770 Robinson Oct 2005 B1
6961689 Greenberg Nov 2005 B1
6970843 Forte Nov 2005 B1
6973589 Wright Dec 2005 B2
6983886 Natsukari et al. Jan 2006 B2
6993507 Meyer Jan 2006 B2
6996263 Jones et al. Feb 2006 B2
6999943 Johnson Feb 2006 B1
7003040 Yi Feb 2006 B2
7004382 Sandru Feb 2006 B2
7010155 Koakutsu et al. Mar 2006 B2
7010507 Anderson Mar 2006 B1
7016704 Pallakoff Mar 2006 B2
7027171 Watanabe Apr 2006 B1
7028886 Maloney Apr 2006 B1
7039048 Monta May 2006 B1
7046991 Little May 2006 B2
7051001 Slater May 2006 B1
7058036 Yu Jun 2006 B1
7062099 Li et al. Jun 2006 B2
7062456 Riehl et al. Jun 2006 B1
7062768 Kubo Jun 2006 B2
7072862 Wilson Jul 2006 B1
7076458 Lawlor et al. Jul 2006 B2
7086003 Demsky Aug 2006 B2
7092561 Downs, Jr. Aug 2006 B2
7104443 Paul et al. Sep 2006 B1
7113925 Waserstein Sep 2006 B2
7114649 Nelson Oct 2006 B2
7116446 Maurer Oct 2006 B2
7117171 Pollin Oct 2006 B1
7120461 Cho Oct 2006 B2
7139594 Nagatomo Nov 2006 B2
7140539 Crews Nov 2006 B1
7163347 Lugg Jan 2007 B2
7178721 Maloney Feb 2007 B2
7181430 Buchanan et al. Feb 2007 B1
7184980 Allen-Rouman et al. Feb 2007 B2
7197173 Jones et al. Mar 2007 B2
7200255 Jones Apr 2007 B2
7204412 Foss, Jr. Apr 2007 B2
7207478 Blackson et al. Apr 2007 B1
7216106 Buchanan May 2007 B1
7219082 Forte May 2007 B2
7219831 Murata May 2007 B2
7240336 Baker Jul 2007 B1
7245765 Myers et al. Jul 2007 B2
7249076 Pendleton Jul 2007 B1
7252224 Verma Aug 2007 B2
7257246 Brodie et al. Aug 2007 B1
7266230 Doran Sep 2007 B2
7290034 Budd Oct 2007 B2
7299970 Ching Nov 2007 B1
7299979 Phillips Nov 2007 B2
7313543 Crane Dec 2007 B1
7314163 Crews et al. Jan 2008 B1
7321874 Dilip Jan 2008 B2
7321875 Dilip Jan 2008 B2
7325725 Foss, Jr. Feb 2008 B2
7328190 Smith et al. Feb 2008 B2
7330604 Wu et al. Feb 2008 B2
7331523 Meier et al. Feb 2008 B2
7336813 Prakash et al. Feb 2008 B2
7343320 Treyz Mar 2008 B1
7349566 Jones et al. Mar 2008 B2
7349585 Li Mar 2008 B2
7356505 March Apr 2008 B2
7369713 Suino May 2008 B2
7377425 Ma May 2008 B1
7379978 Anderson May 2008 B2
7385631 Maeno Jun 2008 B2
7386511 Buchanan Jun 2008 B2
7388683 Rodriguez et al. Jun 2008 B2
7389912 Starrs Jun 2008 B2
7391897 Jones Jun 2008 B2
7391934 Goodall et al. Jun 2008 B2
7392935 Byrne Jul 2008 B2
7401048 Rosedale Jul 2008 B2
7403917 Larsen Jul 2008 B1
7406198 Aoki et al. Jul 2008 B2
7419093 Blackson et al. Sep 2008 B1
7421107 Lugg Sep 2008 B2
7421410 Schechtman et al. Sep 2008 B1
7427016 Chimento Sep 2008 B2
7433098 Klein et al. Oct 2008 B2
7437327 Lam Oct 2008 B2
7440924 Buchanan Oct 2008 B2
7447347 Weber Nov 2008 B2
7455220 Phillips Nov 2008 B2
7455221 Sheaffer Nov 2008 B2
7460108 Tamura Dec 2008 B2
7460700 Tsunachima et al. Dec 2008 B2
7461779 Ramachandran Dec 2008 B2
7461780 Potts Dec 2008 B2
7471818 Price Dec 2008 B1
7475040 Buchanan Jan 2009 B2
7477923 Wallmark Jan 2009 B2
7480382 Dunbar Jan 2009 B2
7480422 Ackley et al. Jan 2009 B2
7489953 Griffin Feb 2009 B2
7490242 Torres Feb 2009 B2
7497429 Reynders Mar 2009 B2
7503486 Ahles Mar 2009 B2
7505759 Rahman Mar 2009 B1
7506261 Statou Mar 2009 B2
7509287 Nutahara Mar 2009 B2
7512564 Geer Mar 2009 B1
7519560 Lam Apr 2009 B2
7520420 Phillips Apr 2009 B2
7520422 Robinson et al. Apr 2009 B1
7536354 deGroeve et al. May 2009 B1
7536440 Budd May 2009 B2
7539646 Gilder May 2009 B2
7540408 Levine Jun 2009 B2
7542598 Jones Jun 2009 B2
7545529 Borrey et al. Jun 2009 B2
7548641 Gilson et al. Jun 2009 B2
7566002 Love et al. Jul 2009 B2
7571848 Cohen Aug 2009 B2
7577614 Warren et al. Aug 2009 B1
7587066 Cordery et al. Sep 2009 B2
7587363 Cataline Sep 2009 B2
7590275 Clarke et al. Sep 2009 B2
7599543 Jones Oct 2009 B2
7599888 Manfre Oct 2009 B2
7602956 Jones Oct 2009 B2
7606762 Heit Oct 2009 B1
7609873 Foth et al. Oct 2009 B2
7619721 Jones Nov 2009 B2
7620231 Jones Nov 2009 B2
7620604 Bueche, Jr. Nov 2009 B1
7630518 Frew et al. Dec 2009 B2
7644037 Ostrovsky Jan 2010 B1
7644043 Minowa Jan 2010 B2
7647275 Jones Jan 2010 B2
7668363 Price Feb 2010 B2
7672940 Viola Mar 2010 B2
7676409 Ahmad Mar 2010 B1
7680732 Davies et al. Mar 2010 B1
7680735 Loy Mar 2010 B1
7689482 Lam Mar 2010 B2
7697776 Wu et al. Apr 2010 B2
7698222 Bueche, Jr. Apr 2010 B1
7702588 Gilder et al. Apr 2010 B2
7734545 Fogliano Jun 2010 B1
7743979 Fredman Jun 2010 B2
7753268 Robinson et al. Jul 2010 B1
7761358 Craig et al. Jul 2010 B2
7766244 Field Aug 2010 B1
7769650 Bleunven Aug 2010 B2
7792752 Kay Sep 2010 B1
7792753 Slater et al. Sep 2010 B1
7793833 Yoon et al. Sep 2010 B2
7810714 Murata Oct 2010 B2
7812986 Graham et al. Oct 2010 B2
7818245 Prakash et al. Oct 2010 B2
7856402 Kay Dec 2010 B1
7873200 Oakes, III et al. Jan 2011 B1
7876949 Oakes, III et al. Jan 2011 B1
7885451 Walls et al. Feb 2011 B1
7885880 Prasad et al. Feb 2011 B1
7894094 Nacman et al. Feb 2011 B2
7896232 Prasad et al. Mar 2011 B1
7900822 Prasad et al. Mar 2011 B1
7903863 Jones et al. Mar 2011 B2
7904386 Kalra et al. Mar 2011 B2
7912785 Kay Mar 2011 B1
7935441 Tononishi May 2011 B2
7949587 Morris et al. May 2011 B1
7950698 Popadic et al. May 2011 B2
7953441 Lors May 2011 B2
7962411 Prasad et al. Jun 2011 B1
7970677 Oakes, III et al. Jun 2011 B1
7974899 Prasad et al. Jul 2011 B1
7978900 Nepomniachtchi et al. Jul 2011 B2
7987231 Karkanias Jul 2011 B2
7996312 Beck et al. Aug 2011 B1
7996314 Smith et al. Aug 2011 B1
7996315 Smith et al. Aug 2011 B1
7996316 Smith et al. Aug 2011 B1
8000514 Nepomniachtchi et al. Aug 2011 B2
8001051 Smith et al. Aug 2011 B1
8045784 Price et al. Oct 2011 B2
8046301 Smith et al. Oct 2011 B1
8060442 Hecht et al. Nov 2011 B1
8116533 Kiplinger et al. Feb 2012 B2
8203640 Kim et al. Jun 2012 B2
8204293 Csulits et al. Jun 2012 B2
8235284 Prasad et al. Aug 2012 B1
8320657 Burks et al. Nov 2012 B1
8341077 Nichols et al. Dec 2012 B1
8351678 Medina, III Jan 2013 B1
8358826 Medina, III Jan 2013 B1
8369650 Zanfir et al. Feb 2013 B2
8374963 Billman Feb 2013 B1
8391599 Medina, III Mar 2013 B1
8392332 Oakes, III et al. Mar 2013 B1
8401962 Bent et al. Mar 2013 B1
8422758 Bueche, Jr. Apr 2013 B1
8433127 Harpel et al. Apr 2013 B1
8433647 Yarbrough Apr 2013 B1
8452689 Medina, III May 2013 B1
8464933 Prasad et al. Jun 2013 B1
8531518 Zomet Sep 2013 B1
8538124 Harpel et al. Sep 2013 B1
8542921 Medina Sep 2013 B1
8548267 Yacoub et al. Oct 2013 B1
8559766 Tilt et al. Oct 2013 B2
8582862 Nepomniachtchi et al. Nov 2013 B2
8611635 Medina, III Dec 2013 B1
8660952 Viera et al. Feb 2014 B1
8708227 Oakes, III et al. Apr 2014 B1
8731321 Fujiwara et al. May 2014 B2
8732081 Oakes, III et al. May 2014 B1
8751345 Borzyche et al. Jun 2014 B1
8751356 Garcia Jun 2014 B1
8751379 Bueche, Jr. Jun 2014 B1
8768038 Sherman et al. Jul 2014 B1
8768836 Acharya Jul 2014 B1
8799147 Walls et al. Aug 2014 B1
8818033 Liu Aug 2014 B1
8824772 Viera Sep 2014 B2
8837806 Ethington et al. Sep 2014 B1
8843405 Hartman et al. Sep 2014 B1
8929640 Mennie et al. Jan 2015 B1
8950033 Oakes, III et al. Feb 2015 B1
8977571 Bueche, Jr. Mar 2015 B1
8990862 Smith Mar 2015 B1
9009071 Watson et al. Apr 2015 B1
9036040 Danko May 2015 B1
9058512 Medina, III Jun 2015 B1
9064284 Janiszeski et al. Jun 2015 B1
9129340 Medina, III et al. Aug 2015 B1
9159101 Pollack et al. Oct 2015 B1
9177197 Prasad et al. Nov 2015 B1
9177198 Prasad et al. Nov 2015 B1
9224136 Oakes, III et al. Dec 2015 B1
9235860 Boucher et al. Jan 2016 B1
9270804 Dees et al. Feb 2016 B2
9286514 Newman Mar 2016 B1
9311634 Hildebrand Apr 2016 B1
9336517 Prasad et al. May 2016 B1
9384409 Ming Jul 2016 B1
9390339 Danko Jul 2016 B1
9401011 Medina, III et al. Jul 2016 B2
9424569 Sherman et al. Aug 2016 B1
9569756 Bueche, Jr. Feb 2017 B1
9613467 Roberts et al. Apr 2017 B2
9613469 Fish et al. Apr 2017 B2
9619872 Medina, III et al. Apr 2017 B1
9626183 Smith et al. Apr 2017 B1
9626662 Prasad et al. Apr 2017 B1
9779392 Prasad et al. Oct 2017 B1
9779452 Medina et al. Oct 2017 B1
9785929 Watson et al. Oct 2017 B1
9792654 Limas et al. Oct 2017 B1
9818090 Bueche, Jr. Nov 2017 B1
9824453 Collins et al. Nov 2017 B1
9872454 Pollack et al. Feb 2018 B2
9886642 Danko Feb 2018 B1
9898778 Pollack et al. Feb 2018 B1
9898808 Medina, III et al. Feb 2018 B1
9904848 Newman Feb 2018 B1
9946923 Medina Apr 2018 B1
10013605 Oakes, III et al. Jul 2018 B1
10013681 Oakes, III et al. Jul 2018 B1
10157326 Long et al. Dec 2018 B2
10181087 Danko Jan 2019 B1
10235660 Bueche, Jr. et al. Mar 2019 B1
10325420 Moon Jun 2019 B1
10354235 Medina Jul 2019 B1
10360448 Newman Jul 2019 B1
10373136 Pollack et al. Aug 2019 B1
10380559 Oakes, III et al. Aug 2019 B1
10380562 Prasad et al. Aug 2019 B1
10380565 Prasad Aug 2019 B1
10380683 Voutour et al. Aug 2019 B1
10380993 Clauer Salyers Aug 2019 B1
10402638 Oakes, III et al. Sep 2019 B1
10402790 Clark et al. Sep 2019 B1
10574879 Prasad et al. Feb 2020 B1
10621559 Oakes, III et al. Apr 2020 B1
10621660 Medina et al. Apr 2020 B1
10706466 Ethington et al. Jul 2020 B1
10713629 Medina, III Jul 2020 B1
10719815 Oakes, III et al. Jul 2020 B1
10769598 Oakes, III et al. Sep 2020 B1
10818282 Clauer Salyers Oct 2020 B1
10956879 Eidson Mar 2021 B1
11030752 Backlund Jun 2021 B1
11042940 Limas Jun 2021 B1
11042941 Limas Jun 2021 B1
11062130 Medina, III Jul 2021 B1
11062131 Medina, III Jul 2021 B1
11062283 Prasad Jul 2021 B1
11064111 Prasad Jul 2021 B1
11068976 Voutour Jul 2021 B1
11070868 Mortensen Jul 2021 B1
11222315 Prasad et al. Jan 2022 B1
11232517 Medina et al. Jan 2022 B1
20010004235 Maloney Jun 2001 A1
20010014881 Drummond Aug 2001 A1
20010016084 Pollard et al. Aug 2001 A1
20010018739 Anderson Aug 2001 A1
20010027994 Hayashida Oct 2001 A1
20010030695 Prabhu et al. Oct 2001 A1
20010037299 Nichols et al. Nov 2001 A1
20010042171 Vermeulen Nov 2001 A1
20010042785 Walker Nov 2001 A1
20010043748 Wesolkowski et al. Nov 2001 A1
20010047330 Gephart Nov 2001 A1
20010051965 Guillevic Dec 2001 A1
20010054020 Barth et al. Dec 2001 A1
20020001393 Jones Jan 2002 A1
20020013767 Katz Jan 2002 A1
20020169715 Ruth et al. Jan 2002 A1
20020016763 March Feb 2002 A1
20020016769 Barbara et al. Feb 2002 A1
20020023055 Antognini et al. Feb 2002 A1
20020032656 Chen Mar 2002 A1
20020038289 Lawlor et al. Mar 2002 A1
20020052841 Guthrie May 2002 A1
20020052853 Munoz May 2002 A1
20020065786 Martens et al. May 2002 A1
20020072974 Pugliese Jun 2002 A1
20020075524 Blair Jun 2002 A1
20020084321 Martens Jul 2002 A1
20020087467 Mascavage, III et al. Jul 2002 A1
20020107809 Biddle et al. Aug 2002 A1
20020116329 Serbetcioglu Aug 2002 A1
20020116335 Star Aug 2002 A1
20020118891 Rudd Aug 2002 A1
20020120562 Opiela Aug 2002 A1
20020129249 Maillard et al. Sep 2002 A1
20020133409 Sawano et al. Sep 2002 A1
20020138445 Laage Sep 2002 A1
20020138522 Muralidhar Sep 2002 A1
20020145035 Jones Oct 2002 A1
20020147798 Huang Oct 2002 A1
20020150279 Scott Oct 2002 A1
20020152160 Allen-Rouman et al. Oct 2002 A1
20020152161 Aoike Oct 2002 A1
20020152164 Dutta Oct 2002 A1
20020152165 Dutta et al. Oct 2002 A1
20020152169 Dutta et al. Oct 2002 A1
20020152170 Dutta Oct 2002 A1
20020153414 Stoutenburg et al. Oct 2002 A1
20020154815 Mizutani Oct 2002 A1
20020159648 Alderson et al. Oct 2002 A1
20020171820 Okamura Nov 2002 A1
20020172516 Aoyama Nov 2002 A1
20020178112 Goeller Nov 2002 A1
20020186881 Li Dec 2002 A1
20020188564 Star Dec 2002 A1
20020195485 Pomerleau et al. Dec 2002 A1
20030005326 Flemming Jan 2003 A1
20030018897 Bellis, Jr. et al. Jan 2003 A1
20030023557 Moore Jan 2003 A1
20030026609 Parulski Feb 2003 A1
20030038227 Sesek Feb 2003 A1
20030051138 Maeda et al. Mar 2003 A1
20030055756 Allan Mar 2003 A1
20030055776 Samuelson Mar 2003 A1
20030072568 Lin et al. Apr 2003 A1
20030074315 Lam Apr 2003 A1
20030075596 Koakutsu Apr 2003 A1
20030075916 Gorski Apr 2003 A1
20030078883 Stewart et al. Apr 2003 A1
20030081824 Mennie May 2003 A1
20030086615 Dance et al. May 2003 A1
20030093367 Allen-Rouman et al. May 2003 A1
20030093369 Ijichi et al. May 2003 A1
20030097592 Adusumilli May 2003 A1
20030102714 Rhodes et al. Jun 2003 A1
20030105688 Brown Jun 2003 A1
20030105714 Alarcon-Luther et al. Jun 2003 A1
20030132384 Sugiyama et al. Jul 2003 A1
20030133608 Bernstein et al. Jul 2003 A1
20030135457 Stewart et al. Jul 2003 A1
20030139999 Rowe Jul 2003 A1
20030159046 Choi et al. Aug 2003 A1
20030167225 Adams Sep 2003 A1
20030177448 Levine et al. Sep 2003 A1
20030187790 Swift et al. Oct 2003 A1
20030191615 Bailey Oct 2003 A1
20030191869 Williams Oct 2003 A1
20030200107 Allen et al. Oct 2003 A1
20030200174 Star Oct 2003 A1
20030212904 Randle et al. Nov 2003 A1
20030213841 Josephson et al. Nov 2003 A1
20030217005 Drummond et al. Nov 2003 A1
20030218061 Filatov Nov 2003 A1
20030225705 Park et al. Dec 2003 A1
20030231285 Ferguson Dec 2003 A1
20030233278 Marshall Dec 2003 A1
20030233318 King et al. Dec 2003 A1
20040010466 Anderson Jan 2004 A1
20040010803 Berstis Jan 2004 A1
20040012496 De Souza Jan 2004 A1
20040013284 Yu Jan 2004 A1
20040017482 Weitman Jan 2004 A1
20040024626 Bruijning Feb 2004 A1
20040024708 Masuda Feb 2004 A1
20040029591 Chapman et al. Feb 2004 A1
20040030741 Wolton et al. Feb 2004 A1
20040057697 Renzi Mar 2004 A1
20040058705 Morgan Mar 2004 A1
20040061913 Takiguchi Apr 2004 A1
20040066031 Wong Apr 2004 A1
20040066419 Pyhalammi Apr 2004 A1
20040069841 Wong Apr 2004 A1
20040071333 Douglas et al. Apr 2004 A1
20040075754 Nakajima et al. Apr 2004 A1
20040076320 Downs, Jr. Apr 2004 A1
20040078299 Down-Logan Apr 2004 A1
20040080795 Bean et al. Apr 2004 A1
20040089711 Sandru May 2004 A1
20040093303 Picciallo May 2004 A1
20040093305 Kight May 2004 A1
20040103057 Melbert et al. May 2004 A1
20040103296 Harp May 2004 A1
20040109596 Doran Jun 2004 A1
20040110975 Osinski et al. Jun 2004 A1
20040117302 Weichert Jun 2004 A1
20040122754 Stevens Jun 2004 A1
20040133511 Smith et al. Jul 2004 A1
20040138974 Shimamura Jul 2004 A1
20040148235 Craig et al. Jul 2004 A1
20040158549 Matena Aug 2004 A1
20040165096 Maeno Aug 2004 A1
20040170259 Park Sep 2004 A1
20040171371 Paul Sep 2004 A1
20040201695 Inasaka Oct 2004 A1
20040201741 Ban Oct 2004 A1
20040202349 Erol et al. Oct 2004 A1
20040205459 Green Oct 2004 A1
20040210515 Hughes Oct 2004 A1
20040210523 Gains et al. Oct 2004 A1
20040217170 Takiguchi et al. Nov 2004 A1
20040228277 Williams Nov 2004 A1
20040236647 Acharya Nov 2004 A1
20040236688 Bozeman Nov 2004 A1
20040238619 Nagasaka et al. Dec 2004 A1
20040240722 Tsuji et al. Dec 2004 A1
20040245324 Chen Dec 2004 A1
20040247199 Murai et al. Dec 2004 A1
20040248600 Kim Dec 2004 A1
20040252679 Williams Dec 2004 A1
20040260636 Marceau Dec 2004 A1
20040267665 Nam et al. Dec 2004 A1
20040267666 Minami Dec 2004 A1
20050001421 Luth et al. Jan 2005 A1
20050010108 Rahn et al. Jan 2005 A1
20050015341 Jackson Jan 2005 A1
20050015342 Murata et al. Jan 2005 A1
20050021466 Buchanan et al. Jan 2005 A1
20050030388 Stavely et al. Feb 2005 A1
20050033645 Duphily Feb 2005 A1
20050033685 Reyes Feb 2005 A1
20050033690 Antognini et al. Feb 2005 A1
20050033695 Minowa Feb 2005 A1
20050034046 Berkmann Feb 2005 A1
20050035193 Gustin et al. Feb 2005 A1
20050038746 Latimer et al. Feb 2005 A1
20050038754 Geist Feb 2005 A1
20050044042 Mendiola Feb 2005 A1
20050044577 Jerding Feb 2005 A1
20050049950 Johnson Mar 2005 A1
20050071283 Randle et al. Mar 2005 A1
20050075969 Nielson et al. Apr 2005 A1
20050075974 Turgeon Apr 2005 A1
20050078336 Ferlitsch Apr 2005 A1
20050080725 Pick Apr 2005 A1
20050082364 Alvarez et al. Apr 2005 A1
20050086140 Ireland Apr 2005 A1
20050086168 Alvarez Apr 2005 A1
20050091161 Gustin Apr 2005 A1
20050096992 Geisei May 2005 A1
20050097019 Jacobs May 2005 A1
20050097046 Singfield May 2005 A1
20050097050 Orcutt May 2005 A1
20050100216 Myers et al. May 2005 A1
20050108164 Salafia May 2005 A1
20050108168 Halpin May 2005 A1
20050115110 Dinkins Jun 2005 A1
20050125338 Tidwell et al. Jun 2005 A1
20050125360 Tidwell et al. Jun 2005 A1
20050127160 Fujikawa Jun 2005 A1
20050131820 Rodriguez Jun 2005 A1
20050143136 Lev et al. Jun 2005 A1
20050144131 Aziz Jun 2005 A1
20050149436 Elterich Jul 2005 A1
20050157174 Kitamura et al. Jul 2005 A1
20050168566 Tada Aug 2005 A1
20050171899 Dunn Aug 2005 A1
20050171907 Lewis Aug 2005 A1
20050177499 Thomas Aug 2005 A1
20050177518 Brown Aug 2005 A1
20050182710 Anderson Aug 2005 A1
20050188306 Mackenzie Aug 2005 A1
20050198364 del Val et al. Sep 2005 A1
20050205660 Munte Sep 2005 A1
20050205661 Taylor Sep 2005 A1
20050209961 Michelsen Sep 2005 A1
20050216409 McMonagle et al. Sep 2005 A1
20050220324 Klein et al. Oct 2005 A1
20050228733 Bent Oct 2005 A1
20050238257 Kaneda et al. Oct 2005 A1
20050252955 Sugai Nov 2005 A1
20050267843 Acharya et al. Dec 2005 A1
20050268107 Harris et al. Dec 2005 A1
20050269412 Chiu Dec 2005 A1
20050273368 Hutten et al. Dec 2005 A1
20050278250 Zair Dec 2005 A1
20050281448 Lugg Dec 2005 A1
20050281450 Richardson Dec 2005 A1
20050281471 LeConte Dec 2005 A1
20050281474 Huang Dec 2005 A1
20050289030 Smith Dec 2005 A1
20050289059 Brewington et al. Dec 2005 A1
20050289182 Pandian et al. Dec 2005 A1
20060002426 Madour Jan 2006 A1
20060004660 Pranger Jan 2006 A1
20060015450 Guck et al. Jan 2006 A1
20060017752 Kurzweil et al. Jan 2006 A1
20060025697 Kurzweil Feb 2006 A1
20060039628 Li et al. Feb 2006 A1
20060039629 Li Feb 2006 A1
20060041506 Mason et al. Feb 2006 A1
20060045321 Yu Mar 2006 A1
20060045374 Kim et al. Mar 2006 A1
20060045379 Heaney, Jr. et al. Mar 2006 A1
20060047593 Naratil Mar 2006 A1
20060053056 Alspach-Goss Mar 2006 A1
20060059085 Tucker Mar 2006 A1
20060064368 Forte Mar 2006 A1
20060071950 Kurzweil et al. Apr 2006 A1
20060077941 Alagappan et al. Apr 2006 A1
20060080245 Bahl Apr 2006 A1
20060085357 Pizarro Apr 2006 A1
20060085516 Farr et al. Apr 2006 A1
20060102704 Reynders May 2006 A1
20060103893 Azimi et al. May 2006 A1
20060106691 Sheaffer May 2006 A1
20060106717 Randle May 2006 A1
20060110063 Weiss May 2006 A1
20060112013 Maloney May 2006 A1
20060115110 Rodriguez Jun 2006 A1
20060115141 Koakutsu et al. Jun 2006 A1
20060118613 McMann Jun 2006 A1
20060124730 Maloney Jun 2006 A1
20060144924 Stover Jul 2006 A1
20060144950 Johnson Jul 2006 A1
20060161501 Waserstein Jul 2006 A1
20060164682 Lev Jul 2006 A1
20060166178 Driedijk Jul 2006 A1
20060167818 Wentker et al. Jul 2006 A1
20060182331 Gilson et al. Aug 2006 A1
20060182332 Weber Aug 2006 A1
20060186194 Richardson Aug 2006 A1
20060202014 VanKirk et al. Sep 2006 A1
20060206506 Fitzpatrick Sep 2006 A1
20060208059 Cable et al. Sep 2006 A1
20060210138 Hilton et al. Sep 2006 A1
20060212391 Norman et al. Sep 2006 A1
20060212393 Brown Sep 2006 A1
20060214940 Kinoshita Sep 2006 A1
20060215204 Miyamoto et al. Sep 2006 A1
20060215230 Borrey et al. Sep 2006 A1
20060221198 Fry et al. Oct 2006 A1
20060222260 Sambongi et al. Oct 2006 A1
20060229976 Jung Oct 2006 A1
20060229986 Corder Oct 2006 A1
20060229987 Leekley Oct 2006 A1
20060238503 Smith Oct 2006 A1
20060242062 Peterson Oct 2006 A1
20060242063 Peterson Oct 2006 A1
20060248009 Hicks et al. Nov 2006 A1
20060249567 Byrne Nov 2006 A1
20060255124 Hoch Nov 2006 A1
20060274164 Kimura et al. Dec 2006 A1
20060279628 Fleming Dec 2006 A1
20060282383 Doran Dec 2006 A1
20060291744 Ikeda et al. Dec 2006 A1
20070002157 Shintani et al. Jan 2007 A1
20070005467 Haigh et al. Jan 2007 A1
20070016796 Singhal Jan 2007 A1
20070019243 Sato Jan 2007 A1
20070022053 Waserstein Jan 2007 A1
20070027802 VanDeburg et al. Feb 2007 A1
20070030357 Levien et al. Feb 2007 A1
20070030363 Cheatle et al. Feb 2007 A1
20070031022 Frew Feb 2007 A1
20070038561 Vancini et al. Feb 2007 A1
20070041629 Prakash et al. Feb 2007 A1
20070050292 Yarbrough Mar 2007 A1
20070053574 Verma et al. Mar 2007 A1
20070058851 Quine Mar 2007 A1
20070058874 Tabata et al. Mar 2007 A1
20070063016 Myatt Mar 2007 A1
20070064991 Douglas et al. Mar 2007 A1
20070065143 Didow et al. Mar 2007 A1
20070075772 Kokubo Apr 2007 A1
20070076940 Goodall et al. Apr 2007 A1
20070076941 Carreon et al. Apr 2007 A1
20070077921 Hayashi Apr 2007 A1
20070080207 Williams Apr 2007 A1
20070082700 Landschaft Apr 2007 A1
20070084911 Crowell Apr 2007 A1
20070086642 Foth Apr 2007 A1
20070086643 Spier Apr 2007 A1
20070094088 Mastie Apr 2007 A1
20070094140 Riney et al. Apr 2007 A1
20070100748 Dheer May 2007 A1
20070110277 Hayduchok et al. May 2007 A1
20070116364 Kleihorst et al. May 2007 A1
20070118472 Allen-Rouman et al. May 2007 A1
20070118747 Pintsov et al. May 2007 A1
20070122024 Haas et al. May 2007 A1
20070127805 Foth et al. Jun 2007 A1
20070129955 Dalmia Jun 2007 A1
20070130063 Jindia Jun 2007 A1
20070136198 Foth et al. Jun 2007 A1
20070138255 Carreon et al. Jun 2007 A1
20070140545 Rossignoli Jun 2007 A1
20070140594 Franklin Jun 2007 A1
20070143208 Varga Jun 2007 A1
20070150337 Hawkins et al. Jun 2007 A1
20070156438 Popadic Jul 2007 A1
20070168265 Rosenberger Jul 2007 A1
20070171288 Inoue Jul 2007 A1
20070172107 Jones Jul 2007 A1
20070172148 Hawley Jul 2007 A1
20070175977 Bauer et al. Aug 2007 A1
20070179883 Questembert Aug 2007 A1
20070183000 Eisen et al. Aug 2007 A1
20070183652 Backstrom et al. Aug 2007 A1
20070183741 Lerman et al. Aug 2007 A1
20070194102 Cohen Aug 2007 A1
20070198432 Pitroda et al. Aug 2007 A1
20070203708 Polycn et al. Aug 2007 A1
20070208816 Baldwin et al. Sep 2007 A1
20070217669 Swift et al. Sep 2007 A1
20070233585 Ben Simon et al. Oct 2007 A1
20070235518 Mueller et al. Oct 2007 A1
20070235520 Smith et al. Oct 2007 A1
20070241179 Davis Oct 2007 A1
20070244782 Chimento Oct 2007 A1
20070246525 Smith et al. Oct 2007 A1
20070251992 Sharma et al. Nov 2007 A1
20070255652 Tumminaro Nov 2007 A1
20070255653 Tumminaro Nov 2007 A1
20070255662 Tumminaro Nov 2007 A1
20070258634 Simonoff Nov 2007 A1
20070262137 Brown Nov 2007 A1
20070262148 Yoon et al. Nov 2007 A1
20070268540 Gaspardo et al. Nov 2007 A1
20070271182 Prakash et al. Nov 2007 A1
20070278286 Crowell et al. Dec 2007 A1
20070288380 Starrs Dec 2007 A1
20070288382 Narayanan et al. Dec 2007 A1
20070295803 Levine et al. Dec 2007 A1
20070299928 Kohli et al. Dec 2007 A1
20080002911 Eisen Jan 2008 A1
20080021802 Pendelton Jan 2008 A1
20080040280 Davis et al. Feb 2008 A1
20080046362 Easterly Feb 2008 A1
20080052182 Marshall Feb 2008 A1
20080059376 Davis Mar 2008 A1
20080063253 Wood Mar 2008 A1
20080065524 Matthews et al. Mar 2008 A1
20080068674 McIntyre Mar 2008 A1
20080069427 Liu Mar 2008 A1
20080071679 Foley Mar 2008 A1
20080071721 Wang Mar 2008 A1
20080080760 Ronca Apr 2008 A1
20080086420 Gilder et al. Apr 2008 A1
20080086421 Gilder Apr 2008 A1
20080091599 Foss, Jr. Apr 2008 A1
20080097899 Jackson et al. Apr 2008 A1
20080097907 Till et al. Apr 2008 A1
20080103790 Abernethy May 2008 A1
20080103967 Ackert et al. May 2008 A1
20080113674 Baig May 2008 A1
20080114739 Hayes May 2008 A1
20080116257 Fickling May 2008 A1
20080117991 Peddireddy May 2008 A1
20080119178 Peddireddy May 2008 A1
20080133411 Jones et al. Jun 2008 A1
20080140552 Blaikie Jun 2008 A1
20080147549 Ruthbun Jun 2008 A1
20080156438 Stumphauzer et al. Jul 2008 A1
20080162319 Breeden et al. Jul 2008 A1
20080162320 Mueller et al. Jul 2008 A1
20080162350 Allen-Rouman et al. Jul 2008 A1
20080162371 Rampell et al. Jul 2008 A1
20080177659 Lacey et al. Jul 2008 A1
20080180750 Feldman Jul 2008 A1
20080208727 McLaughlin et al. Aug 2008 A1
20080214180 Cunningham et al. Sep 2008 A1
20080219543 Csulits Sep 2008 A1
20080245869 Berkun et al. Oct 2008 A1
20080247629 Gilder Oct 2008 A1
20080247655 Yano Oct 2008 A1
20080249931 Gilder Oct 2008 A1
20080249951 Gilder et al. Oct 2008 A1
20080262953 Anderson Oct 2008 A1
20080275821 Bishop et al. Nov 2008 A1
20080316542 Mindrum et al. Dec 2008 A1
20090024520 Drory et al. Jan 2009 A1
20090046938 Yoder Feb 2009 A1
20090060396 Blessan et al. Mar 2009 A1
20090066987 Inokuchi Mar 2009 A1
20090076921 Nelson et al. Mar 2009 A1
20090092309 Caiman et al. Apr 2009 A1
20090108080 Meyer Apr 2009 A1
20090110281 Hirabayashi Apr 2009 A1
20090114716 Ramachandran May 2009 A1
20090132813 Schibuk May 2009 A1
20090141962 Borgia et al. Jun 2009 A1
20090166406 Pigg et al. Jul 2009 A1
20090167870 Caleca et al. Jul 2009 A1
20090171795 Clouthier et al. Jul 2009 A1
20090171819 Von der Emde et al. Jul 2009 A1
20090171825 Roman Jul 2009 A1
20090173781 Ramachandran Jul 2009 A1
20090185738 Nepomniachtchi Jul 2009 A1
20090190823 Walters Jul 2009 A1
20090192938 Amos Jul 2009 A1
20090236413 Mueller et al. Sep 2009 A1
20090252437 Li Oct 2009 A1
20090254447 Blades Oct 2009 A1
20090257641 Liu et al. Oct 2009 A1
20090281904 Pharris Nov 2009 A1
20090284637 Parulski Nov 2009 A1
20090313167 Dujari et al. Dec 2009 A1
20100007899 Lay Jan 2010 A1
20100008579 Smirnov Jan 2010 A1
20100016016 Brundage et al. Jan 2010 A1
20100027679 Sunahara et al. Feb 2010 A1
20100047000 Park et al. Feb 2010 A1
20100057578 Blair et al. Mar 2010 A1
20100061446 Hands et al. Mar 2010 A1
20100078472 Lin et al. Apr 2010 A1
20100082470 Walach Apr 2010 A1
20100128131 Tenchio et al. May 2010 A1
20100165015 Barkley et al. Jul 2010 A1
20100225773 Lee Sep 2010 A1
20100226559 Najari et al. Sep 2010 A1
20100260408 Prakash et al. Oct 2010 A1
20100262522 Anderson et al. Oct 2010 A1
20100312705 Caruso et al. Dec 2010 A1
20110016084 Mundy et al. Jan 2011 A1
20110112967 Anderson et al. May 2011 A1
20110276483 Saegert et al. Nov 2011 A1
20110310442 Popadic et al. Dec 2011 A1
20120045112 Lundblad et al. Feb 2012 A1
20120099792 Chevion et al. Apr 2012 A1
20120185383 Atsmon Jul 2012 A1
20120229872 Dolev Sep 2012 A1
20130021651 Popadic et al. Jan 2013 A9
20130120595 Roach et al. May 2013 A1
20130155474 Roach et al. Jun 2013 A1
20130198071 Jurss Aug 2013 A1
20130201534 Carlen Aug 2013 A1
20130223721 Nepomniachtchi et al. Aug 2013 A1
20130297353 Strange Nov 2013 A1
20140032406 Roach et al. Jan 2014 A1
20140067661 Elischer Mar 2014 A1
20140197922 Stanwood et al. Jul 2014 A1
20140236820 Carlton et al. Aug 2014 A1
20140258169 Wong et al. Sep 2014 A1
20140279453 Belchee et al. Sep 2014 A1
20150039528 Minogue et al. Feb 2015 A1
20150090782 Dent Apr 2015 A1
20160034590 Endras et al. Feb 2016 A1
20160142625 Weksler et al. May 2016 A1
20160335816 Thoppae et al. Nov 2016 A1
20170146602 Samp et al. May 2017 A1
20170033761 Beguesse Nov 2017 A1
Foreign Referenced Citations (21)
Number Date Country
2619884 Mar 2007 CA
1897644 Jan 2007 CN
0 984 410 Mar 2000 EP
0984410 Mar 2000 EP
1 855 459 May 2007 EP
2004-23158 Jan 2004 JP
2004-23158 Jan 2004 JP
3708807 Oct 2005 JP
2006-174105 Jun 2006 JP
20040076131 Aug 2004 KR
WO 9614707 May 1996 WO
WO 9837655 Aug 1998 WO
WO 0161436 Aug 2001 WO
WO 0161436 Aug 2001 WO
WO 2004008350 Jan 2004 WO
WO 2005043857 May 2005 WO
WO 2005124657 Dec 2005 WO
WO 2006075967 Jul 2006 WO
WO 2006086768 Aug 2006 WO
WO 2006136958 Dec 2006 WO
WO 2007024889 Mar 2007 WO
Non-Patent Literature Citations (587)
Entry
IPR2020-00882—Mitek Systems, Inc. v United Services Automobile Association, Petition for Inter Partes Review of U.S. Pat. No. 9,818,090, dated Apr. 30, 2020, 102 pgs.
IPR2020-00975—Mitek Systems, Inc. v United Services Automobile Association, Petition for Inter Partes Review of U.S. Pat. No. 8,977,571 dated May 22, 2020, 96 pgs.
IPR2020-00976—Mitek Systems, Inc. v United Services Automobile Association, Petition for Inter Partes Review of U.S. Pat. No. 8,699,779, dated May 22, 2020, 87 pgs.
IPR2020-01101—Mitek Systems, Inc. v United Services Automobile Association, Petition for Inter Partes Review of U.S. Pat. No. 9,336,517, dated Jun. 12, 2020, 91 pgs.
IPR2019-01081—U.S. Pat. No. 9,336,517 B1, Decision Granting Institution of Inter Partes Review 35 U.S.C. § 314; 37 C.F.R. § 42.4, dated Jan. 13, 2020, 60 pgs.
IPR2019-01082—U.S. Pat. No. 8,977,571 B1, Decision Granting Institution of Inter Partes Review 35 U.S.C. § 314; 37 C.F.R. § 42.4, dated Dec. 13, 2019, 56 pgs.
IPR2019-01083—U.S. Pat. No. 8,977,571 B1, Decision Granting Institution of Inter Partes Review 35 U.S.C. § 314, dated Jan. 9, 2020, 58 pgs.
Higgins, Ray et al., “Working With Image Cash Letters (ICLs) X9.37, 180 or 187 files”, All My Papers, 2009, 36 pgs.
X9.100-180, “The New ICL Standard is Published”, All My Papers, 2006, 3 pgs.
X9.37 Specifications | X9Ware LLC, dated 2018, 3 pgs.
“Getting Started with ICLs aka X9.37 Files”, All My Papers, May 2, 2006, 39 pgs.
Federal Reserve Banks Plan Black-and-White Image Standard and Quality Checks, May 2004, 2 pgs.
Caplan, J. et al., Most Influential Gadgets and Gizmos 2002: Sanyo SCP-5300, 2002, 1 pg.
Hill, S., “From J-Phone to Lumina 1020: A complete history of the camera phone”, Digital Trends, 2020, 9 pgs.
Hoffman, J., “Before there Were Smartphones, There was I-Mode”, 1999, 5 pgs.
“Vodafone calls on mobiles to go live!”, 2002, 8 pgs.
“Sprint PCS Vision Guide”, 2005, 86 pgs.
FDIC—Remote Capture: A Primer, 2009, 3 pgs.
Callaham, J., “The first camera phone was sold 20 years ago, and it's not what you expect”, Android Authority, 2019, 5 pgs.
Fujisawa, H. et al., “Information Capturing Camera and Developmental Issues”, IEEE Xplore, downloaded on Aug. 18, 2020, 4 pgs.
Rohs, M. et al., “A Conceptual Framework for Camera Phone-based Interaction Techniques”, in Pervasive Computing, Berlin Heidelberg, 2005, pp. 171-189.
Koga, M. et al., Camera-based Kanji OCR for Mobile-phones: Practical Issues, IEEE, 2005, 5 pgs.
Parikh, T., “Using Mobile Phones for Secure, Distributed Document Processing in the Developing World”, IEEE Pervasive Computing, vol. 4, No. 2, 2005, 9 pgs.
Parikh, T., “Mobile Phones and Paper Documents: Evaluating a New Approach for Capturing Microfinance Data in Rural India”, CHI 2006 Proceedings, 2006, 10 pgs.
Magid, L., “A baby girl and the camera phone were born 20 years ago”, Mercury News, 2017, 3 pgs.
Liang, J. et al., “Camera-based analysis of text and documents: a survey”, IJDAR, vol. 7, 2005, pp. 84-104, 21 pgs.
Gutierrez, L., “Innovation: From Campus to Startup”, Business Watch, 2008, 2 pgs.
Doermann, D. et al., “The function of documents”, Image and Vision Computing, vol. 16, 1998, pp. 799-814.
Mirmehdi, M. et al., “Towards Optimal Zoom for Automatic Target Recognition”, in Proceedings of the Scandinavian Conference on Image Analysis, 1:447-454, 1997, 7 pgs.
Mirmehdi, M. et al., “Extracting Low Resolution Text with an Active Camera for OCR”, in Proceedings of the IX Spanish Symposium on Pattern Recognition and Image Processing (pp. 43-48), 2001, 6 pgs.
Zandifar, A. et al., “A Video Based Interface To Textual Information For The Visually Impaired”, IEEE 17th International Symposium on Personal, Indoor and Mobile Radio Communications, 1-5, 2002, 6 pgs.
Laine, M. et al., “A Standalone OCR System For Mobile Cameraphones”, IEEE, 2006, 5 pgs.
Federal Reserve Banks to Adopt DSTU X9.37-2003 Format for Check 21 Image Services, 2004, 2 pgs.
Dhandra, B.V. et al., “Skew Detection in Binary Image Documents Based on Image Dilation and Region Labeling Approach”, IEEE, The 18th International Conference on Pattern Recognition (ICPR'06), 2006, 4 pgs.
PNC Bank to Offer Ease of Online Deposit Service Integrated QuickBooks to Small Business, RemoteDepositCapture.com, Jul. 24, 2006, 2 pgs.
Sony Ericsson K800i, User Manual, Part 1, 2006, 98 pgs.
Nokia N90 User Guide, 2005, 132 pgs.
Nokia N90 Phone Features, 2005, 4 pgs.
Sprint PCS Vision Picture Phone, PM-8920 by Audiovox, User's Manual, Part 1, 2004, 103 pgs.
Sprint PCS Vision Picture Phone, PM-8920 by Audiovox, User's Manual, Part 2, 2004, 103 pgs.
Pappas, A., “Taking Sharper Pictures Is Now a Snap as Sprint Launches First 1.3-Megapixel Camera Phone in the United States”, 2004, 2 pgs.
Sony Ericsson K800i—Product Overview, 2006, 2 pgs.
“Accept “Customer Not Present” Checks,” Accept Check Online, http://checksoftware.com, Cited in U.S. Pat. No. 7,900,822, as dated 2007 (1 pg).
“Adjusting Brightness and Contrast”, www.eaglesoftware.com/adjustin.htm, retrieved on May 4, 2009 (4 pgs).
“Best practices for producing quality digital image files,” Digital Images Guidelines, http://deepblue.lib.umich.edu/bitstream/2027.42/40247/1/Images-Best_Practice.pdf, downloaded 2007 (2 pgs).
“Chapter 7 Payroll Programs,” Uniform Staff Payroll System, http://www2.oecn.k12.oh.us/www/ssdt/usps/usps_user_guide_005.html, Cited in U.S. Pat. No. 7,900,822, as dated 2007 (9 pgs).
“Check 21—The check is not in the post”, RedTitan Technology 2004 http://www.redtitan.com/check21/htm (3 pgs).
“Check 21 Solutions,” Columbia Financial International, Inc. http://www.columbiafinancial.us/check21/solutions.htm, Cited in U.S. Pat. No. 7,900,822, as dated 2007 (8 pgs).
“Check Fraud: A Guide to Avoiding Losses”, All Net, http://all.net/books/audit/checkfraud/security.htm, Cited in U.S. Pat. No. 7,900,822, as dated 2007 (1 pg).
“Clearing House Electronic Check Clearing System (CHECCS) Operating Rules,” An IP.com Prior Art Database Technical Disclosure, Jul. 29, 2015 (35 pgs).
“Compliance with Regulation CC”, http://www.federalreserve.gov/Pubs/regcc/regcc.htm, Jan. 24, 2006 (6 pgs).
“Customer Personalized Bank Checks and Address Labels” Checks Your Way Inc., http://www.checksyourway.com/htm/web_pages/faq.htm, Cited in U.S. Pat. No. 7,900,822, as dated 2007 (6 pgs).
“Deposit Now: Quick Start User Guide,” BankServ, 2007, 2 pages.
“Direct Deposit Application for Payroll”, Purdue University, Business Office Form 0003, http://purdue.edu/payroll/pdf/directdepositapplication.pdf, Jul. 2007 (2 pgs).
“Direct Deposit Authorization Form”, www.umass.edu/humres/library/DDForm.pdf, May 2003 (3 pgs).
“Direct Deposit,” University of Washington, http://www.washington.edu/admin/payroll/directdeposit.html, Cited in U.S. Pat. No. 7,900,822, as dated 2007 (3 pgs).
“Electronic Billing Problem: The E-check is in the mail” American Banker—vol. 168, No. 95, May 19, 2003 (4 pgs).
“First Wireless Handheld Check and Credit Card Processing Solution Launched by Commerciant®, MobileScape® 5000 Eliminates Bounced Checks, Enables Payments Everywhere,” Business Wire, Mar. 13, 2016, 3 pages.
“Frequently Asked Questions” Bank of America, http://www.bankofamerica.com/deposits/checksave/index.cfm?template-lc_faq_bymail, Cited in U.S. Pat. No. 7,900,822, as dated 2007 (2 pgs).
“Full Service Direct Deposit”, www.nonprofitstaffing.com/images/upload/dirdepform.pdf. Cited in U.S. Pat. No. 7,900,822, as dated 2001, (2 pgs).
“How to Digitally Deposit a Check Image”, Smart Money Daily, Copyright 2008 (5 pgs).
“ImageNet Mobile Deposit Provides Convenient Check Deposit and Bill Pay to Mobile Consumers,” Miteksystems, 2008 (2 pgs).
“It's the easiest way to Switch banks”, LNB, http://www.lnbky.com/pdf/LNBswitch-kit10-07.pdf Cited in U.S. Pat. No. 7,996,316, as dated 2007 (7 pgs).
“Lesson 38—More Bank Transactions”, Turtle Soft, http://www.turtlesoft.com/goldenseal-software-manual.lesson38.htm, Cited in U.S. Pat. No. 7,900,822, as dated 2007 (8 pgs).
“Middleware”, David E. Bakken, Encyclopedia of Distributed Computing, Kluwer Academic Press, 2001 (6 pgs).
“Mitek Systems Announces Mobile Deposit Application for Apple iPhone,” http://prnewswire.com/cgi-bin/stories/pl?ACCT=104&STORY=/www/story/10-01- . . . , Nov. 25, 2008 (2 pgs).
“NOVA Enhances Electronic Check Service to Benefit Multi-Lane Retailers,” Business Wire, Nov. 28, 2006, 2 pages.
“Personal Finance”, PNC, http://www.pnc.com/webapp/unsec/productsandservice.do?sitearea=/PNC/home/personal/account+services/quick+switch/quick+switch+faqs, Cited in U.S. Pat. No. 7,900,822, as dated 2007 (12 pgs).
“Quicken Bill Pay”, Retrieved from the Internet on Nov. 27, 2007 at: <URL:http://quicken.intuit.com/quicken-bill-pay.jhtml>, 2 pgs.
“Refractive index” Wikipedia, the free encyclopedia; http://en.wikipedia.org/wiki/refractiveindex.com Oct. 16, 2007 (4 pgs).
“Remote check deposit is the answer to a company's banking problem,” Daily Breeze, Torrance, CA, Nov. 17, 2006, 2 pgs.
“Remote Deposit Capture”, Plante & Moran, http://plantemoran.com/industries/financial/institutions/bank/resources/community+bank+advisor/2007+summer+issue/remote+deposit+capture.htm, Cited in U.S. Pat. No. 7,900,822, as dated 2007 (3 pgs).
“Remote Deposit” National City, http://www.nationalcity.com/smallbusiness/cashmanagement/remotedeposit/default.asp; Cited in U.S. Pat. No. 7,900,822, as dated 2007 (1 pg).
“Save on ATM Fees”, RedEye Edition, Chicago Tribune, Chicago, IL Jun. 30, 2007 (2 pgs).
“SNB Check Capture: SmartClient User's Guide,” Nov. 2006, 21 pgs.
“Start to Simplify with Check Imaging a Smarter Way to Bank”, Retrieved from the Internet on Nov. 27, 2007, at: <URL: http://www.midnatbank.com/Internet%20Banking/internet_Banking.html>, 3 pgs.
“Switching Made Easy,” Bank of North Georgia, http://www.banknorthgeorgia.com/cmsmaster/documents/286/documents616.pdf, 2007 (7 pgs).
“Two Words Every Business Should Know: Remote Deposit,” Canon, http://www.rpsolutions.com/rpweb/pdfs/canon_rdc.pdf, 2005 (7 pgs).
“Virtual Bank Checks”, Morebusiness.com, http://www.morebusiness.com/running_yourbusiness/businessbits/d908484987.brc, Cited in U.S. Pat. No. 7,900,822, as dated 2007 (3 pgs).
“WallStreetGrapevine.com” Stocks on the Rise: JADG, BKYI, MITK; Mar. 3, 2008 (4 pgs).
“What is check Fraud”, National Check Fraud Center, http://www.ckfraud.org/ckfraud.html, Cited in U.S. Pat. No. 7,900,822, as dated 2007 (12 pgs).
“Exchangeable image file format for digital still cameras: Exif Version 2.2,” Standard of Electronics and Information Technology Industries Associate, JEITA CP-3451, Technical Standardization Committee on AV & IT Storage Systems and Equipments, Japan Electronics and Information Technology Industries Association, Apr. 2002 (154 pgs). (retrieved from: http://www.exif.org/Exif2-2.PDF).
“Machine Accepts Bank Deposits”, New York Times, Apr. 12, 1961, 1 pg.
12 CFR § 229.51 and Appendix D to Part 229 (Jan. 1, 2005 edition), 3 pgs.
149 Cong. Rec. H9289, Oct. 8, 2003, 6 pgs.
Affinity Federal Credit Union, “Affinity Announces Online Deposit,” Aug. 4, 2005 (1 pg).
Albrecht, W. Steve, “Check Kiting: Detection, Prosecution and Prevention,” The FBI Law Enforcement Bulletin, Nov. 1, 1993 (6 pgs).
Alves, Vander and Borba, Paulo; “Distributed Adapters Pattern: A Design for Object-Oriented Distributed Applications”; First Latin American Conference on Pattern Languages of Programming; Oct. 2001; pp. 132-142; Rio de Janeiro, Brazil (11 pgs).
Amber Avalona-Butler / Paraglide, “At Your Service: Best iPhone Apps for Military Lifestyle,” Jul. 9, 2010 (2 pgs).
Anderson, Milton M. “FSML and Echeck”, Financial Services Technology Consortium, 1999 (17 pgs).
Apple Announces the New iPhone 3GS—The Fastest, Most Powerful iPhone Yet, Jun. 8, 2009, located on the Internet at: http://www.apple.com/newsroom/2009/06/08Apple-Announces-the-New-iPhone-3GS-The-Fastest-Most-Powerful-iPhone-Yet, 4 pgs.
Apple Reinvents the Phone with iPhone, Jan. 2007, located on the Internet at: https://www.apple.com/newsroom/2007/01/09Apple-Reinvents-the-Phone-with-iPhone/, 4 pgs.
Application as filed Jun. 25, 2007 for U.S. Appl. No. 11/861,164 (39 pgs).
Application as filed on Apr. 3, 2008 for U.S. Appl. No. 12/062,143 (27 pgs).
Application as filed on Aug. 19, 2010 for U.S. Appl. No. 12/859,741 (235 pgs).
Application as filed on Aug. 21, 2008 for U.S. Appl. No. 12/195,723 (38 pgs).
Application as filed on Aug. 21, 2009 for U.S. Appl. No. 12/545,127 (45 pgs).
Application as filed on Aug. 28, 2009 for U.S. Appl. No. 12/549,443 (41 pgs).
Application as filed on Dec. 20, 2006 for U.S. Appl. No. 11/613,656 (21 pgs).
Application as filed on Dec. 29, 2005 for U.S. Appl. No. 11/321,025 (19 pgs).
Application as filed on Dec. 30, 2010 for U.S. Appl. No. 12/982,494 (280 pgs).
Application as filed on Dec. 30, 2010 for U.S. Appl. No. 12/982,561 (275 pgs).
Application as filed on Dec. 30, 2010 for U.S. Appl. No. 12/982,578 (274 pgs).
Application as filed on Dec. 30, 2010 for U.S. Appl. No. 12/982,594 (275 pgs).
Application as filed on Feb. 15, 2012 for U.S. Appl. No. 13/397,405 (19 pgs).
Application as filed on Feb. 18, 2009 for U.S. Appl. No. 12/388,005 (37 pgs).
Application as filed on Jan. 6, 2017 for U.S. Appl. No. 15/400,350 (62 pgs).
Application as filed on Jan. 7, 2013 for U.S. Appl. No. 13/735,678 (30 pgs).
Application as filed on Jul. 13, 2006 for U.S. Appl. No. 11/487,537 (23 pgs).
Application as filed on Jul. 27, 2009 for U.S. Appl. No. 12/509,613 (48 pgs).
Application as filed on Jul. 27, 2009 for U.S. Appl. No. 12/509,680 (41 pgs).
Application as filed on Jun. 11, 2008 for U.S. Appl. No. 12/137,051 (29 pgs).
Application as filed on Jun. 8, 2011 for U.S. Appl. No. 13/155,976 (352 pgs).
Application as filed on Jun. 8, 2011 for U.S. Appl. No. 13/156,007 (356 pgs).
Application as filed on Jun. 8, 2011 for U.S. Appl. No. 13/156,018 (353 pgs).
Application as filed on Mar. 15, 2007 for U.S. Appl. No. 11/686,924 (34 pgs).
Application as filed on Mar. 15, 2007 for U.S. Appl. No. 11/686,928 (36 pgs).
Application as filed on Mar. 15, 2013 for U.S. Appl. No. 13/842,112 (62 pgs).
Application as filed on Mar. 4, 2009 for U.S. Appl. No. 12/397,671 (40 pgs).
Application as filed on Mar. 4, 2009 for U.S. Appl. No. 12/397,930 (37 pgs).
Application as filed on May 10, 2007 for U.S. Appl. No. 11/747,222 (35 pgs).
Application as filed on Oct. 17, 2008 for U.S. Appl. No. 12/253,278 (42 pgs).
Application as filed on Oct. 23, 2007 for U.S. Appl. No. 11/876,925 (36 pgs).
Application as filed on Oct. 23, 2007 for U.S. Appl. No. 11/877,335 (29 pgs).
Application as filed on Oct. 25, 2007 for U.S. Appl. No. 11/923,839 (22 pgs).
Application as filed on Oct. 29, 2007 for U.S. Appl. No. 11/926,388 (23 pgs).
Application as filed on Oct. 30, 2007 for U.S. Appl. No. 11/928,297 (26 pgs).
Application as filed on Oct. 31, 2006 for U.S. Appl. No. 11/590,974 (31 pgs).
Application as filed on Oct. 31, 2006 for U.S. Appl. No. 11/591,008 (27 pgs).
Application as filed on Oct. 31, 2006 for U.S. Appl. No. 11/591,227 (58 pgs).
Application as filed on Oct. 31, 2006 for U.S. Appl. No. 11/591,273 (56 pgs).
Application as filed on Oct. 31, 2007 for U.S. Appl. No. 11/930,537 (27 pgs).
Application as filed on Oct. 31, 2007 for U.S. Appl. No. 11/931,670 (47 pgs).
Application as filed on Oct. 8, 2007 for U.S. Appl. No. 11/868,884 (30 pgs).
Application as filed on Sep. 28, 2007 for U.S. Appl. No. 11/864,569 (35 pgs).
Application as filed on Sep. 8, 2008 for U.S. Appl. No. 12/205,996 (30 pgs).
Aradhye, Hrishikesh B., “A Generic Method for Determining Up/Down Orientation of Text in Roman and Non-Roman Scripts,” Pattern Recognition Society, Dec. 13, 2014, 18 pages.
Archive Index Systems; Panini My Vision X-30 or VX30 or X30 © 1994-2008 Archive Systems, Inc. P.O. Box 40135 Bellevue, WA USA 98015 (2 pgs).
Askey, Canon EOS 40D Review (pts. 1,4,10), Digital Photography Review, located on the Internet at: http://www.dpreview.com/reviews/canoneos40d, 24 pgs.
Askey, Leica Digilux 2 Review (pts. 1,3,7), Digital Photography Review, May 20, 2004, located on the Internet at: https://www.dpreview.com/reviews/leicadigilux2, 20 pgs.
Askey, Nikon D300 In-depth Review (pts.1,3,9), Digital Photography Review, Mar. 12, 2008, located on the Internet at: https://www.dpreview.com/reviews/nikond300, 24 pgs.
Askey, Panasonic Lumix DMC-L1 Review (pts.1,3,7), Digital Photography Review, Apr. 11, 2007, located on the Internet at: https://www.dpreview.com/reviews/panasonicdmc11, 24 pgs.
Askey, Sony Cyber-shot DSC-R1 Review (pts. 1,3,7), Digital Photography Review, Dec. 6, 2005, located on the Internet at: http://www.dpreview.com/reviews/sonydscr1, 24 pgs.
Association of German Banks, SEPA 2008: Uniform Payment Instruments for Europe, Berlin, Cited in U.S. Pat. No. 7,900,822, as dated Jul. 2007, Bundesverband deutscher Banken e.V. (42 pgs).
Automated Clearing Houses (ACHs), Federal Reserve Bank of New York (May 2000) available at: https://www.newyorkfed.org/aboutthefed/fedpoint/fed31.html, (attached as Exhibit 12 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 4 pgs.
Automated Merchant Systems, Inc., “Electronic Check Conversion,” http://www.automatedmerchant.com/electronic_check_conversion.cfm, 2006, downloaded Oct. 18, 2006 (3 pgs).
Bank Systems & Technology, Untitled Article, May 1, 2006, http://www.banktech.com/showarticle.jhtml?articleID=187003126, “Are you Winning in the Payment World?” (4 pgs).
Bankserv, “DepositNow: What's the difference?” Cited in U.S. Pat. No. 7,970,677, as dated 2006, (4 pgs).
Bankserv, Product Overview, http://www.bankserv.com/products/remotedeposit.htm, Cited in U.S. Pat. No. 7,970,677, as dated 2006, (3 pgs).
Berman, How Hitchcock Turned a Small Budget Into a Great Triumph, Time.com, Apr. 29, 2015, located on the Internet at: http://time.com/3823112/alfred-hitchcock-shadow-of-a-doubt, 1 pg.
Big Red Book, Adobe Systems Incorporated, copyright 2000, (attached as Exhibit 27 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 45 pgs.
Bills, Steve, “Automated Amount Scanning Is Trend in Remote-Deposit,” American Banker, New York, NY, Aug. 30, 2005, (3 pgs).
Blafore, Bonnie “Lower Commissions, Fewer Amenities”, Better Investing, Madison Heights: Feb. 2003, vol. 52, Iss 6, (4 pgs).
BLM Technologies, “Case Study: Addressing Check 21 and RDC Error and Fraud Threats,” Remote Deposit Capture News Articles from Jun. 11, 2007, Retrieved from http://www.remotedepositcapture.com/News/june_11_2007.htm on Feb. 19, 2008 (5 pgs).
Blue Mountain Consulting, from URL: www.bluemontainconsulting.com, Cited in U.S. Pat. No. 7,900,822, as dated Apr. 26, 2006 (3 pgs).
Board of Governors of the Federal Reserve System, “Report to the Congress on the Check Clearing for the 21st Century Act of 2003,” Apr. 2007, Submitted to Congress pursuant to section 16 of the Check Clearing for the 21st Century Act of 2003, (59 pgs).
Braun, Tim, “Camdesk—Towards Portable and Easy Document Capture,” Image Understanding and Pattern Recognition Research Group, Department of Computer Science, University of Kaiserslautern, Technical Report, Mar. 29, 2005 (64 pgs). (Retrieved from https://pdfs.semanticscholar.org/93b2/ea0d12f24c91f3c46fa1c0d58a76bb132bd2.pdf).
Brian Chen et al., iPhone 3GS Trounces Predecessors, Rivals in Web Browser Speed Test, Wired, Jun. 24, 2009, located on the Internet at: www.wired.com/2009.3gs-speed/, 10 pgs.
Bruene, Jim; “CheckFree to Enable In-Home Remote Check Deposit for Consumers and Small Business”, NetBanker.com, Financial Insite, Inc., http://www.netbanker.com/2008/02/checkfree_to_enableinhome_rem.html, Feb. 5, 2008 (3 pgs).
Bruene, Jim; “Digital Federal Credit Union and Four Others Offer Consumer Remote Deposit Capture Through EasCorp”, NetBanker—Tracking Online Finance, www.netbanker.com/2008/04/digital_federal_credit_union_a.html, Apr. 13, 2008 (3 pgs).
Bruno, M., “Instant Messaging,” Bank Technology News, Dec. 2002 (3 pgs).
Burnett, J. “Depository Bank Endorsement Requirements,” BankersOnline.com, http://www.bankersonline.com/cgi-bin/printview/printview.pl, Jan. 6, 2003 (3 pgs).
Canon EOS 40D Digital Camera Instruction Manual, located on the Internet at: http://gdlp01.c-wss.com/gds/6/0900008236/01/EOS40D_HG_EN.pdf (attached as Exhibit 6 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 38 pgs.
Canon, ImageFormula CR-25/CR-55, “Improve Your Bottom Line with Front-Line Efficiencies”, 0117W117, 1207-55/25-1 OM-BSP, Cited in U.S. Pat. No. 7,949,587 as dated 2007. (4 pgs).
Carrubba, P. et al., “Remote Deposit Capture: A White Paper Addressing Regulatory, Operational and Risk Issues,” NetDeposit Inc., 2006 (11 pgs).
Century Remote Deposit High-Speed Scanner User's Manual Release 2006, (Century Manual), Century Bank, 2006, (32 pgs).
Check Clearing for the 21st Century Act Foundation for Check 21 Compliance Training, Federal Financial Institutions Examination Council, (Oct. 16, 2004), available on the Internet at: https://web.archive.org/web/20041016100648/https://www.ffiec.gov/exam/check21/check21foundationdoc.htm, (excerpts attached as Exhibit 20 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 11 pgs.
Check Clearing for the 21st Century Act, H. R. Rep. No. 108-132, Jun. 2, 2003, 20 pgs.
Chiang, Chuck, The Bulletin, “Remote banking offered”, http://bendbulletin.com/apps/pbcs.dll/article?AID=/20060201/BIZ0102/602010327&templ . . . , May 23, 2008 (2 pgs).
Claims as filed Jan. 24, 2018 for U.S. Appl. No. 15/878,821 (5 pgs).
Claims as filed Jan. 31, 2018 for U.S. Appl. No. 15/884,990 (6 pgs).
Claims as filed May 18, 2018 for U.S. Appl. No. 15/983,983 (3 pgs).
Claims as filed on Apr. 1, 2013 for U.S. Appl. No. 13/854,521 (5 pgs).
Claims as filed on Apr. 3, 2008 for U.S. Appl. No. 12/062,163 (3 pgs).
Claims as filed on Apr. 3, 2008 for U.S. Appl. No. 12/062,175 (3 pgs).
Claims as filed on Apr. 30, 2013 for U.S. Appl. No. 13/874,145 (5 pgs).
Claims as filed on Apr. 9, 2018 for U.S. Appl. No. 15/948,510 (5 pgs).
Claims as filed on Apr. 9, 2018 for U.S. Appl. No. 15/948,549 (5 pgs).
Claims as filed on Aug. 19, 2010 for U.S. Appl. No. 12/859,752 (5 pgs).
Claims as filed on Aug. 21, 2009 for U.S. Appl. No. 12/545,127 (5 pgs).
Claims as filed on Dec. 15, 2011 for U.S. Appl. No. 13/327,478 (4 pgs).
Claims as filed on Dec. 20, 2006 for U.S. Appl. No. 11/613,671 (3 pgs).
Claims as filed on Dec. 20, 2012 for U.S. Appl. No. 13/722,576 (4 pgs).
Claims as filed on Dec. 28, 2016 for U.S. Appl. No. 15/392,950 (5 pgs).
Claims as filed on Dec. 29, 2005 for U.S. Appl. No. 11/320,998 (3 pgs).
Claims as filed on Dec. 29, 2005 for U.S. Appl. No. 11/321,027 (3 pgs).
Claims as filed on Dec. 8, 2010 for U.S. Appl. No. 12/963,513 (7 pgs).
Claims as filed on Dec. 9, 2015 for U.S. Appl. No. 14/964,279 (5 pgs).
Claims as filed on Feb. 12, 2013 for U.S. Appl. No. 13/765,412 (1 pg).
Claims as filed on Feb. 15, 2012 for U.S. Appl. No. 13/397,437 (6 pgs).
Claims as filed on Feb. 16, 2011 for U.S. Appl. No. 13/028,477 (3 pgs).
Claims as filed on Feb. 19, 2013 for U.S. Appl. No. 13/770,048 (4 pgs).
Claims as filed on Feb. 3, 2016 for U.S. Appl. No. 15/014,918 (5 pgs).
Claims as filed on Jan. 20, 2011 for U.S. Appl. No. 13/010,644 (9 pgs).
Claims as filed on Jan. 31, 2011 for U.S. Appl. No. 13/017,865 (11 pgs).
Claims as filed on Jul. 19, 2017 for U.S. Appl. No. 15/654,497 (1 pg).
Claims as filed on Jul. 28, 2017 for U.S. Appl. No. 15/663,284 (6 pgs).
Claims as filed on Jul. 28, 2017 for U.S. Appl. No. 15/663,305 (6 pgs).
Claims as filed on Jun. 12, 2015 for U.S. Appl. No. 14/738,340 (4 pgs).
Claims as filed on Jun. 13, 2012 for U.S. Appl. No. 13/495,971 (36 pgs).
Claims as filed on Jun. 15, 2016 for U.S. Appl. No. 15/183,461 (36 pgs).
Claims as filed on Jun. 20, 2013 for U.S. Appl. No. 13/922,686 (7 pgs).
Claims as filed on Jun. 9, 2014 for U.S. Appl. No. 14/299,456 (36 pgs).
Claims as filed on Mar. 15, 2007 for U.S. Appl. No. 11/686,925 (5 pgs).
Claims as filed on Mar. 20, 2014 for U.S. Appl. No. 14/220,799 (1 pg).
Claims as filed on Mar. 23, 2017 for U.S. Appl. No. 15/467,167 (4 pgs).
Claims as filed on Mar. 25, 2014 for U.S. Appl. No. 14/224,944 (4 pgs).
Claims as filed on Mar. 25, 2014 for U.S. Appl. No. 14/225,090 (1 pg).
Claims as filed on Mar. 3, 2014 for U.S. Appl. No. 14/195,482 (4 pgs).
Claims as filed on May 10, 2007 for U.S. Appl. No. 11/747,223 (4 pgs).
Claims as filed on May 18, 2011 for U.S. Appl. No. 13/110,077 (9 pgs).
Claims as filed on May 2, 2011 for U.S. Appl. No. 13/098,566 (10 pgs).
Claims as filed on Nov. 20, 2012 for U.S. Appl. No. 13/682,268 (4 pgs).
Claims as filed on Nov. 23, 2016 for U.S. Appl. No. 15/360,738 (3 pgs).
Claims as filed on Nov. 25, 2015 for U.S. Appl. No. 14/952,625 (1 pg).
Claims as filed on Nov. 7, 2016 for U.S. Appl. No. 15/345,190 (5 pgs).
Claims as filed on Oct. 9, 2015 for U.S. Appl. No. 14/879,868 (4 pgs).
Claims as filed on Oct. 16, 2014 for U.S. Appl. No. 14/516,335 (4 pgs).
Claims as filed on Oct. 16, 2014 for U.S. Appl. No. 14/516,350 (4 pgs).
Claims as filed on Oct. 16, 2014 for U.S. Appl. No. 14/516,364 (4 pgs).
Claims as filed on Oct. 2, 2017 for U.S. Appl. No. 15/722,836 (4 pgs).
Claims as filed on Oct. 23, 2007 for U.S. Appl. No. 11/877,382 (6 pgs).
Claims as filed on Oct. 24, 2008 for U.S. Appl. No. 12/257,471 (4 pgs).
Claims as filed on Oct. 31, 2006 for U.S. Appl. No. 11/590,963 (3 pgs).
Claims as filed on Oct. 31, 2006 for U.S. Appl. No. 11/590,995 (3 pgs).
Claims as filed on Oct. 31, 2006 for U.S. Appl. No. 11/590,998 (4 pgs).
Claims as filed on Oct. 31, 2007 for U.S. Appl. No. 11/931,804 (4 pgs).
Claims as filed on Oct. 8, 2007 for U.S. Appl. No. 11/868,878 (4 pgs).
Claims as filed on Sep. 14, 2012 for U.S. Appl. No. 13/619,026 (3 pgs).
Claims as filed on Sep. 2, 2008 for U.S. Appl. No. 12/202,781 (4 pgs).
Claims as filed on Sep. 8, 2008 for U.S. Appl. No. 12/206,001 (3 pgs).
Claims as filed on Sep. 8, 2008 for U.S. Appl. No. 12/206,007 (3 pgs).
Claims as filed Sep. 19, 2017 for U.S. Appl. No. 15/709,071 (1 pg).
Claims as filed Sep. 19, 2017 for U.S. Appl. No. 15/709,126 (1 pg).
Claims as filed Sep. 19, 2017 for U.S. Appl. No. 15/709,143 (1 pg).
Claims as filed Sep. 8, 2014 for U.S. Appl. No. 14/479,478 (5 pgs).
CNN.com/technology, “Scan, deposit checks from home”, www.cnn.com/2008/TECH/biztech/02/07/check.scanning.ap/index.html, Feb. 7, 2008 (3 pgs).
Constanzo, Chris, “Remote Check Deposit: Wells Captures A New Checking Twist”, Bank Technology News Article—May 2005, www.americanbanker.com/btn_article.html?id=20050502YQ50FSYG (2 pgs).
Craig, Ben, “Resisting Electronic Payment Systems: Burning Down the House?”, Federal Reserve Bank of Cleveland, Jul. 1999 (4 pgs).
Creativepaymentsolutions.com, “Creative Payment Solutions—Websolution,” www.creativepaymentsolution.com/cps/financialservices/websolution/default.html, Copyright 2008, Creative Payment Solutions, Inc. (1 pg).
Credit Union Journal, “The Ramifications of Remote Deposit Capture Success”, www.cujournal.com/printthis.html?id=20080411EODZT57G, Apr. 14, 2008 (1 pg).
Credit Union Journal, “AFCU Averaging 80 DepositHome Transactions Per Day”, Credit Union Journal, Aug. 15, 2005 (1 pg).
Credit Union Management, “When You Wish Upon an Imaging System . . . the Right Selection Process can be the Shining Star,” Credit Union Management, Aug. 1993, printed from the internet at <http://search.proquest.com/docview/227756409/14138420743684F7722/15?accountid=14 . . . >, on Oct. 19, 2013 (11 pgs).
David B. Humphrey & Robert Hunt, Getting Rid of Paper: Savings From Check 21, Working Paper No. 12-12, Research Department, Federal Reserve Bank of Philadelphia, (May 2012), available on the Internet at: https://philadelphiafed.org/-/media/research-and-data/publications/working-papers/2012/wp12-12.pdf (attached as Exhibit 14 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 29 pgs.
DCU Member's Monthly—Jan. 2008, “PC Deposit—Deposit Checks from Home!”, http://www.mycreditunionnewsletter.com/dcu/0108/page1.html, Copyright 2008 Digital Federal Credit Union (2 pgs).
De Jesus, A. et al., “Distributed Check Processing in a Check 21 Environment: An educational overview of the opportunities and challenges associated with implementing distributed check imaging and processing solutions,” Panini, 2004, pp. 1-22.
De Queiroz, Ricardo et al., “Mixed Raster Content (MRC) Model for Compound Image Compression”, 1998 (14 pgs).
Debello, James et al., “RDM and Mitek Systems to Provide Mobile Check Deposit,” Mitek Systems, Inc., San Diego, California and Waterloo, Ontario, (Feb. 24, 2009), 2 pgs.
Declaration of Peter Alexander, Ph.D., CBM2019-0004, Nov. 8, 2018, 180 pgs.
Defendant Wells Fargo Bank, N.A.'s Answer, Affirmative Defenses, and Counterclaims to Plaintiff's Complaint, dated Aug. 14, 2018, 64 pgs.
DeYoung, Robert; “The Financial Performance of Pure Play Internet Banks”; Federal Reserve Bank of Chicago Economic Perspectives; 2001; pp. 60-75; vol. 25, No. 1 (16 pgs).
Dias, Danilo et al., “A Model for the Electronic Representation of Bank Checks”, Brasilia Univ. Oct. 2006 (5 pgs).
Digital Transactions News, “An ACH-Image Proposal For Check Roils Banks and Networks” May 26, 2006 (3 pgs).
Dinan, R.F. et al., “Image Plus High Performance Transaction System”, IBM Systems Journal, 1990 vol. 29, No. 3 (14 pgs).
Doermann, David et al., “Progress in Camera-Based Document Image Analysis,” Proceedings of the Seventh International Conference on Document Analysis and Recognition (ICDAR 2003) 0-7695-1960-1/03, 2003 IEEE (11 pages).
Doermann, David, et al., Progress in Camera-Based Document Image Analysis, Proceedings of the Seventh Int'l Conf. on Document Analysis and Recognition, 2003, 11 pages.
Duvall, Mel, “Remote Deposit Capture,” Baseline, vol. 1, Issue 70, Mar. 2007, 2 pgs.
E. MacKenzie, Photography Made Easy, copyright 1845, 80 pgs.
ECU Technologies, “Upost Remote Deposit Solution,” Retrieved from the internet https://www.eutechnologies.com/products/upost.html, downloaded 2009 (1 pg).
EFT Network Unveils FAXTellerPlus, EFT Network, Inc., www.eftnetwork.com, Jan. 13, 2009 (2 pgs).
ElectronicPaymentProviders, Inc., “FAQs: ACH/ARC, CheckVerification/Conversion/Guarantee, RCK Check Re-Presentment,” http://www.useapp.com/faq.htm, downloaded Oct. 18, 2006 (3 pgs).
Ex Parte Quayle Action from corresponding U.S. Appl. No. 12/545,127 dated Jun. 25, 2014 (6 pgs).
Excerpts from American National Standard for Financial Services, ANS, X9.100-140-2004—Specifications for an Image Replacement Document—IRD, Oct. 1, 2004, 16 pgs.
Federal Check 21 Act, “New Check 21 Act effective Oct. 28, 2004: Bank No Longer Will Return Original Cancelled Checks,” Consumer Union's FAQ's and Congressional Testimony on Check 21, www.consumerlaw.org/initiatives/content/check21_content.html, Cited in U.S. Pat. No. 7,873,200, as dated Dec. 2005 (20 pgs).
Federal Reserve Board, “Check Clearing for the 21st Century Act”, FRB, http://www.federalreserve.gov/paymentsystems/truncation/, Mar. 1, 2006 (1 pg).
Federal Reserve System, “12 CFR, Part 229 [Regulation CC]: Availability of Funds and Collection of Checks,” Federal Register, Apr. 28, 1997, pp. 1-50.
Federal Reserve System, “Part IV, 12 CFR Part 229 [Regulation CC]: Availability of Funds and Collection of Checks; Final Rule,” Federal Register, vol. 69, No. 149, Aug. 4, 2004, pp. 47290-47328.
Fest, Glen., “Patently Unaware” Bank Technology News, Apr. 2006, Retrieved from the internet at URL:http://banktechnews.com/article.html?id=2006403T7612618 (5 pgs).
Fidelity Information Services, “Strategic Vision Embraces Major Changes in Financial Services Solutions: Fidelity's long-term product strategy ushers in new era of application design and processing,” Insight, 2004, pp. 1-14.
Fisher, Dan M., “Home Banking in the 21st Century: Remote Capture Has Gone Retail”, May 2008 (4 pgs).
Furst, Karen et al., “Internet Banking: Developments and Prospects”, Economic and Policy Analysis Working Paper 2000-9, Sep. 2000 (60 pgs).
Garry, M., “Checking Options: Retailers face an evolving landscape for electronic check processing that will require them to choose among several scenarios,” Supermarket News, vol. 53, No. 49, 2005 (3 pgs).
Gates, A History of Wireless Standards, Wi-Fi Back to Basics, Aerohive Blog, Jul. 2015, located on the Internet at: http://blog.aerohive.com/a-history-of-wireless-standards, 5 pgs.
German Shegalov, Diplom-Informatiker, “Integrated Data, Message, and Process Recovery for Failure Masking in Web Services”, Dissertation Jul. 2005 (146 pgs).
Gupta, Amar et al., “An Integrated Architecture for Recognition of Totally Unconstrained Handwritten Numerals”, WP#3765, Jan. 1993, Productivity from Information Technology “Profit” Research Initiative Sloan School of Management (20 pgs).
Gupta, Maya R. et al., “OCR binarization and image pre-processing for searching historical documents,” Pattern Recognition, vol. 40, No. 2, Feb. 2007, pp. 389-397.
Hale, J., “Picture this: Check 21 uses digital technology to speed check processing and shorten lag time,” Columbus Business First, http://columbus.bizjournals.com/columbus/stories/2005/03/14focus1.html, downloaded 2007 (3 pgs).
Hartly, Thomas, “Banks Check Out New Image”, Business First, Buffalo: Jul. 19, 2004, vol. 20, Issue 43, (3 pgs).
Heckenberg, D. “Using Mac OS X for Real-Time Image Processing” Oct. 8, 2003 (15 pgs).
Helio Ocean User Manual, located on the Internet at: https://standupwireless.com/wp-content/uploads/2017/04/Manual_PAN-TECH_OCEAN.pdf (excerpts attached as Exhibit 10 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 76 pgs.
Hildebrand, C. et al., “Electronic Money,” Oracle, http://www.oracle.com/oramag/profit/05-feb/p15financial.html, 2005, downloaded Oct. 18, 2006 (5 pgs).
Hillebrand, G., “Questions and Answers About the Check Clearing for the 21st Century Act, ‘Check 21,’” ConsumersUnion.org, http://www.consumersunion.org/finance/ckclear1002.htm, Jul. 27, 2004, downloaded Oct. 18, 2006 (6 pgs).
HTC Touch Diamond Manual, copyright 2008, (attached as Exhibit 11 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 257 pgs.
Iida, Jeanne, “The Back Office: Systems—Image Processing Rolls on as Banks Reap Benefits,” American Banker, Jul. 19, 1993, printed from the internet at <http://search.proquest.com/docview/292903245/14138420743684F7722/14?accountid=14 . . . >, on Oct. 19, 2013 (3 pgs).
Image Master, “Photo Restoration: We specialize in digital photo restoration and photograph repair of family pictures”, http://www.imphotorepair.com, Cited in U.S. Pat. No. 7,900,822, as downloaded Apr. 2007 (1 pg).
Investment Systems Company, “Portfolio Accounting System,” 2000, pp. 1-32.
iPhone App Store Downloads Top 10 Million in First Weekend, Jul. 14, 2008, located on the Internet at: http://www.apple.com/newsroom/2008/07/14iPhone-App-Store-Downloads-Top-10-Million-in-First-Weekend, 3 pgs.
ITU-R M.1225, Guidelines for Evaluation of Radio Transmission Technologies for IMT-2000, dated 1997, located on the Internet at: https://www.itu.int/dms_pubrec/itu-r/rec/m/R-REC-M.1225-0-199702-I!!PDF-E.pdf, 60 pgs.
JBC, “What is a MICR Line?,” eHow.com, retrieved from http://www.ehow.com/about_4684793_what-micr-line.html on May 4, 2009 (2 pgs).
Jeffrey M. Lacker, Payment System Disruptions and the Federal Reserve Following Sep. 11, 2001, The Federal Reserve Bank of Richmond, (Dec. 23, 2003) (attached as Exhibit 19 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 55 pgs.
Johnson, Jennifer J., Secretary of the Board; Federal Reserve System, 12 CFR Part 229, Regulation CC; “Availability of Funds and Collection of Checks”. Cited in U.S. Pat. No. 7,900,822, as dated 2009, (89 pgs).
Joinson et al., Olympus E-30 Review (pts.1,4,8), Digital Photography Review, Mar. 24, 2009, located on the Internet at: www.dpreview.com/reviews/olympus30, 6 pgs.
Kendrick, Kevin B., “Check Kiting, Float for Purposes of Profit,” Bank Security & Fraud Prevention, vol. 1, No. 2, 1994 (3 pgs).
Kiser, Elizabeth K.; “Modeling the Whole Firm: The Effect of Multiple Inputs and Financial Intermediation on Bank Deposit Rates;” FEDS Working Paper No. 2004-07; Jun. 3, 2003; pp. 1-46 (46 pgs).
Knerr et al., The A2iA Intercheque System: Courtesy Amount and Legal Amount Recognition for French Checks in Automated Bankcheck Processing 43-86, Impedovo et al. eds., 1997, 50 pgs.
Knestout, Brian P. et al., “Banking Made Easy” Kiplinger's Personal Finance Washington, Jul. 2003, vol. 57, Iss 7 (5 pgs).
Kornai Andras et al., “Recognition of Cursive Writing on Personal Checks”, Proceedings of International Workshop on the Frontiers in Handwriting Recognition, Cited in U.S. Pat. No. 7,900,822, as dated Sep. 1996, (6 pgs).
Lampert, Christoph et al., “Oblivious Document Capture and Real-Time Retrieval,” International Workshop on Camera Based Document Analysis and Recognition (CBDAR), 2005 (8 pgs). (Retrieved from: http://www-cs.ccny.cuny.edu/˜wolberg/capstone/bookwarp/LampertCBDAR05.pdf).
LEICA DIGILUX 2 Instructions located on the Internet: http://www.overgaard.dk/pdf/d2_manual.pdf (attached as Exhibit 2 from the Defendant Wells Fargo Bank N.A.'s Answer dated Aug. 14, 2018), 95 pgs.
Levitin, Adam J., Remote Deposit Capture: A Legal and Transactional Overview, Banking Law Journal, p. 115, 2009 (RDC).
Liang, Jian et al., Camera-Based Analysis of Text and Documents: A Survey, International Journal on Document Analysis and Recognition, Jun. 21, 2005, 21 pages.
Luo, Xi-Peng et al., “Design and Implementation of a Card Reader Based on Build-In Camera,” Proceedings of the 17th International Conference on Pattern Recognition, 2004, 4 pages.
Masonson, L., “Check Truncation and ACH Trends—Automated Clearing Houses”, Healthcare Financial Management Association, http://www.findarticles.com/p/articles/mLm3276/is_n7_v47/ai_14466034/print, 1993 (2 pgs).
Matthews, Deborah, “Advanced Technology Makes Remote Deposit Capture Less Risky,” Indiana Bankers Association, Apr. 2008 (2 pgs).
Metro 1 Credit Union, “Remote Banking Services,” http://www.metro1cu.org/metro1cu/remote.html, downloaded Apr. 17, 2007 (4 pgs).
Mitek systems, “Imagenet Mobile Deposit”, San Diego, CA, downloaded 2009 (2 pgs).
Mitek Systems: Mitek Systems Launches First Mobile Check Deposit and Bill Pay Application, San Diego, CA, Jan. 22, 2008 (3 pgs).
Mohl, Bruce, “Banks Reimbursing ATM Fee to Compete With Larger Rivals”, Boston Globe, Boston, MA, Sep. 19, 2004 (3 pgs).
Moreau, T., “Payment by Authenticated Facsimile Transmission: a Check Replacement Technology for Small and Medium Enterprises,” CONNOTECH Experts-conseils, Inc., Apr. 1995 (31 pgs).
MOTOMANUAL for MOTORAZR, located on the Internet at: https://www.cellphones.ca/downloads/phones/manuals/motorola-razr-v3xx-manual.pdf (excerpts attached as Exhibit 8 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 34 pgs.
MOTOMANUAL, MOTOROKR-E6-GSM-English for wireless phone, copyright 2006, 144 pgs.
Motorola RAZR MAXX V6 User Manual, located on the Internet at: https://www.phonearena.com/phones/Motorola-RAZR-MAXX-V6_id1680 (attached as Exhibit 7 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 36 pgs.
Nelson, B. et al., “Remote deposit capture changes the retail landscape,” Northwestern Financial Review, http://findarticles.com/p/articles/mi_qa3799/is200607/ai_n16537250, 2006 (3 pgs).
Netbank, Inc., “Branch Out: Annual Report 2004,” 2004 (150 pgs).
Netbank, Inc., “Quick Post: Deposit and Payment Forwarding Service,” 2005 (1 pg).
NetDeposit Awarded Two Patents for Electronic Check Process, NetDeposit, Jun. 18, 2007, (1 pg).
Nikon Digital Camera D300 User's Manual, located on the Internet at: http://download.nikonimglib.com/archive2/iBuJv00Aj97i01y8BrK49XX0Ts69/D300_EU(En)04.pdf (attached as Exhibit 5 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 195 pgs.
Nixon, Julie et al., “Fiserv Research Finds Banks are Interested in Offering Mobile Deposit Capture as an,” Fiserv, Inc. Brookfield, Wis., (Business Wire), (Feb. 20, 2009), 2 pgs.
Nokia N95 8GB User Guide, copyright 2009, located on the Internet at: https://www.nokia.com/en_int/phones/sites/default/files/user-guides/Nokia_N95_8GB_Extended_UG_en.pdf (excerpts attached as Exhibit 9 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 77 pgs.
Notice of Allowance from corresponding U.S. Appl. No. 12/545,127 dated Jan. 17, 2013 (7 pgs).
Notice of Allowance from corresponding U.S. Appl. No. 12/545,127 dated Oct. 15, 2014 (24 pgs).
Office Action dated Oct. 9, 2013 from corresponding U.S. Appl. No. 12/545,127 (7 pgs).
Office Action from corresponding U.S. Appl. No. 12/545,127 dated Apr. 9, 2014 (6 pgs).
Online Deposit: Frequently Asked Questions, http://www.depositnow.com/faq.html, Copyright 2008 (1 pg).
Onlinecheck.com/Merchant Advisors, “Real-Time Check Debit”, Merchant Advisors: Retail Check Processing Check Conversion, http://www.onlinecheck.com/wach/rcareal.htm, Cited in U.S. Pat. No. 7,900,822, as dated 2006 (3 pgs).
Oxley, Michael G., from committee on Financial Services; “Check Clearing For The 21st Century Act”, 108th Congress, 1st Session House of Representatives report 108-132, Jun. 2003 (20 pgs).
Oxley, Michael G., from the committee of conference; “Check Clearing For the 21st Century Act” 108th Congress, 1st Session Senate report 108-291, Oct. 1, 2003 (27 pgs).
Palacios, Rafael et al., “Automatic Processing of Brazilian Bank Checks”. Cited in U.S. Pat. No. 7,900,822, as dated 2002 (28 pgs).
Panasonic Operating Instructions for Digital Camera/Lens Kit Model No. DMC-L1K, https://www.panasonic.com/content/dam/Panasonic/support_manual/Digital_Still_Camera/English_01-vqt0-vqt2/vqt0w95_L1_oi.pdf (attached as Exhibit 4 from the Defendant Wells Fargo Bank N.A.'s Answer dated Aug. 14, 2018), 129 pgs.
Patterson, Scott “USAA Deposit@Home—Another WOW moment for Net Banking”, NextCU.com, Jan. 26, 2007 (5 pgs).
Petition filed by Wells Fargo Bank, N.A. for Covered Business Method Review of Claims 1-20 of U.S. Pat. No. 9,818,090, dated Nov. 8, 2018, 90 pgs.
Petition filed by Wells Fargo Bank, N.A. for Covered Business Method Review of Claims 1-20 of U.S. Pat. No. 9,336,517, dated Nov. 8, 2018, 98 pgs.
Petition filed by Wells Fargo Bank, N.A. for Covered Business Method Review of Claims 1-20 of U.S. Pat. No. 8,977,571, dated Nov. 8, 2018, 95 pgs.
Petition filed by Wells Fargo Bank, N.A. for Covered Business Method Review of Claims 1-23 of U.S. Pat. No. 8,699,779, dated Nov. 8, 2018, 101 pgs.
POP, ARC and BOC-A Comparison, Federal Reserve Banks, at 1 (Jan. 7, 2009) available on the Internet at: https://web.archive.org/web/20090107101808/https://www.frbservices.org/files/eventseducation/pdf/pop_arc_boc_comparison.pdf (attached as Exhibit 13 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 3 pgs.
Public Law 108-100, 108 Congress; “An Act Check Clearing For the 21st Century Act”, Oct. 28, 2003, 117 STAT. 1177 (18 pgs).
Quinn and Roberds, The Evolution of the Check as a Means of Payment: A Historical Survey, Federal Reserve Bank of Atlanta, Economic Review, 2008, 30 pgs.
Rao, Bharat; “The Internet And The Revolution in Distribution: A Cross-Industry Examination”; Technology in Society; 1999; pp. 287-306; vol. 21, No. 3 (20 pgs).
Remotedepositcapture, URL:www.remotedepositcapture.com, Cited in U.S. Pat. No. 7,900,822, as dated 2006 (5 pgs).
RemoteDepositCapture.com, “PNC Bank to Offer Ease of Online Deposit Service Integrated with QuickBooks to Small Businesses”, Remote Deposit Capture News Articles from Jul. 24, 2006, (2 pgs).
RemoteDepositCapture.com, Remote Deposit Capture News Articles from Jul. 6, 2006, “BankServ Announces New Remote Deposit Product Integrated with QuickBooks” (3 pgs).
Remotedepositcapture.com, LLC, "Remote Deposit Capture Overview," RDC Overview, http://remotedepositcapture.com/overview/RDC_overview.htm, Cited in U.S. Pat. No. 7,900,822, as dated Mar. 12, 2007 (4 pgs).
Richey, J. C. et al., “EE 4530 Check Imaging,” Nov. 18, 2008 (10 pgs).
Ritzer, J.R. “Hinky Dinky helped spearhead POS, remote banking movement”, Bank Systems and Equipment, vol. 21, No. 12, Dec. 1984 (1 pg).
Rivlin, Alice M. et al., Chair, Vice Chair—Board of Governors, Committee on the Federal Reserve in the Payments Mechanism—Federal Reserve System, “The Federal Reserve in the Payments Mechanism”, Jan. 1998 (41 pgs).
Rockwell, The Megapixel Myth, KenRockwell.com, 2008, located on the Internet at: http://kenrockwell.com/tech/mpmyth.htm, 6 pgs.
Rose, Sarah et al., "Best of the Web: The Top 50 Financial Websites", Money, New York, Dec. 1999, vol. 28, Iss. 12 (8 pgs).
Shah, Moore's Law, Continuous Everywhere But Differentiable Nowhere, Feb. 12, 2009, located on the Internet at: http://samjshah.com/2009/02/24/moores-law/, 5 pgs.
Shelby, Hon. Richard C. (Committee on Banking, Housing and Urban Affairs); “Check Truncation Act of 2003”, calendar No. 168, 108th Congress, 1st Session Senate report 108-79, Jun. 2003 (27 pgs).
Sony Digital Camera User's Guide/Trouble Shooting Operating Instructions, copyright 2005, located on the Internet at: https://www.sony.co.uk/electronics/support/res/manuals/2654/26544941M.pdf (attached as Exhibit 3 from the Defendant Wells Fargo Bank N.A.'s Answer dated Aug. 14, 2018), 136 pgs.
SoyBank Anywhere, “Consumer Internet Banking Service Agreement,” Dec. 6, 2004 (6 pgs).
Sumits, Major Mobile Milestones—The Last 15 Years, and the Next Five, Cisco Blogs, Feb. 3, 2016, located on the Internet at: https://blogs.cisco.com/sp/mobile-vni-major-mobile-milestones-the-last-15-years-and-the-next-five, 12 pgs.
Teixeira, D., “Comment: Time to Overhaul Deposit Processing Systems,” American Banker, Dec. 10, 1998, vol. 163, No. 235, p. 15 (3 pgs).
Thailandguru.com: How and where to Pay Bills @ www.thailandguru.com/paying-bills.html (2 pgs).
The Automated Clearinghouse, "Retail Payment Systems; Payment Instruments Clearing and Settlement: The Automated Clearinghouse (ACH)", www.ffiec.gov/ffiecinfobase/booklets/retail/retail_02d.html, Cited in U.S. Pat. No. 7,900,822, as dated Dec. 2005 (3 pgs).
The Green Sheet 2.0: Newswire, “CO-OP adds home deposit capabilities to suite of check imaging products”, www.greensheet.com/newswire.php?newswire_id=8799, Mar. 5, 2008 (2 pgs).
Tygar, J.D., Atomicity in Electronic Commerce, In ACM Networker, 2:2, Apr./May 1998 (12 pgs).
U.S. Appl. No. 13/922,686, Office Action dated Oct. 16, 2013, 30 pages.
U.S. Appl. No. 12/545,127, Applicant's Appeal Brief dated Nov. 6, 2012, 21 pages.
U.S. Appl. No. 12/545,127, Office Action dated Apr. 4, 2012, 21 pages.
U.S. Appl. No. 12/545,127, Office Action dated Nov. 8, 2011, 7 pages.
U.S. Appl. No. 12/545,127, Office Action dated Oct. 9, 2013, 7 pages.
U.S. Appl. No. 12/549,443, Applicant's Office Action Response dated Aug. 28, 2012, 11 pages.
U.S. Appl. No. 12/549,443, Office Action dated May 8, 2012, 9 pages.
U.S. Appl. No. 13/922,686, Office Action dated Apr. 25, 2014, 50 pages.
Valentine, Lisa, "Remote Deposit Capture: Hot Just Got Hotter," ABA Banking Journal, Mar. 2006, pp. 1-9.
Vaream, Craig, “Image Deposit Solutions: Emerging Solutions for More Efficient Check Processing,” JP Morgan Chase, Nov. 2005 (16 pgs).
Wade, Will, “Early Debate on Remote-Capture Risk,” American Banker, New York, NY, May 26, 2004 (3 pgs).
Wade, Will, “Early Notes: Updating Consumers on Check 21” American Banker Aug. 10, 2004 (3 pgs).
Wallison, Peter J., “Wal-Mart Case Exposes Flaws in Banking-Commerce Split”, American Banker, vol. 167. No. 8, Jan. 11, 2002 (3 pgs).
Wausau Financial Systems, Understanding Image Quality & Usability Within a New Environment, 2006, 22 pgs.
Wells Fargo 2005 News Releases, “The New Wells Fargo Electronic Deposit Services Break Through Banking Boundaries In The Age of Check 21”, San Francisco Mar. 28, 2005, www.wellsfargo.com/press/3282005_check21Year=2005 (1 pg).
Wells Fargo Commercial, "Remote Deposit", www.wellsfargo.com/com/treasury_mgmt/receivables/electronic/remote_deposit, Copyright 2008 (1 pg).
White, J.M. et al., “Image Thresholding for Optical Character Recognition and Other Applications Requiring Character Image Extraction”, IBM J. Res. Development, Jul. 1983, vol. 27, No. 4 (12 pgs).
Whitney et al., "Reserve Banks to Adopt DSTU X9.37-2003 Format for Check 21 Image Services", American Bankers Association, May 18, 2004, http://www.aba.com/NR/rdonlyres/CBDC1A5C-43E3-43CC-B733-BE417C638618/35930/DSTUFormat.pdf (2 pages).
Wikipedia ®, “Remote Deposit,” http://en.wikipedia.org/wiki/Remote_deposit, 2007 (3 pgs).
Windowsfordevices.com, “Software lets camera phone users deposit checks, pay bills”, www.windowsfordevices.com/news/NS3934956670.html, Jan. 29, 2008 (3 pgs).
Wolfe, Daniel, “Check Image Group Outlines Agenda,” American Banker, New York, N.Y.: Feb. 13, 2009, vol. 174, Iss. 30, p. 12. (2 pgs).
Woody Baird Associated Press, “Pastor's Wife got Scammed—She Apparently Fell for Overseas Money Scheme,” The Commercial Appeal, Jul. 1, 2006, p. A. 1.
Zandifar, A., “A Video-Based Framework for the Analysis of Presentations/Posters,” International Journal on Document Analysis and Recognition, Feb. 2, 2005, 10 pages.
Zhang, C.Y., “Robust Estimation and Image Combining” Astronomical Data Analysis Software and Systems IV, ASP Conference Series, 1995 (5 pgs).
Zions Bancorporation, “Moneytech, the technology of money in our world: Remote Deposit,” http://www.bankjunior.com/pground/moneytech/remote_deposit.jsp, 2007 (2 pgs).
Patent Disclaimer for U.S. Pat. No. 8,699,779, filed on Mar. 4, 2019, 2 pgs.
Patent Disclaimer for U.S. Pat. No. 8,977,571, filed on Feb. 20, 2019, 2 pgs.
Patent Disclaimer for U.S. Pat. No. 9,336,517, filed on Mar. 4, 2019, 2 pgs.
Patent Disclaimer for U.S. Pat. No. 9,818,090, filed on Feb. 20, 2019, 2 pgs.
CBM2019-00002 U.S. Pat. No. 9,818,090, United Services Automobile Association (USAA)'s Patent Owner Preliminary Response, dated Feb. 20, 2019, 75 pgs.
CBM2019-00002 U.S. Pat. No. 9,818,090, Declaration of Tim Crews In Support of Patent Owner Preliminary Response, dated Feb. 20, 2019, 8 pgs.
CBM2019-00002 U.S. Pat. No. 9,818,090, Declaration of Matthew Calman In Support of Patent Owner Preliminary Response, dated Feb. 20, 2019, 14 pgs.
CBM2019-00002 U.S. Pat. No. 9,818,090, Katie Knight Videotape Deposition Transcript, dated Feb. 8, 2019, 27 pgs.
CBM2019-00002 U.S. Pat. No. 9,818,090, Peter Alexander, Ph.D., Oral and Videotaped Deposition, dated Jan. 23, 2019, 27 pgs.
CBM2019-00002 U.S. Pat. No. 9,818,090, United Services Automobile Association (USAA)'s Updated Exhibit List, dated Mar. 19, 2019, 8 pgs.
CBM2019-00003 U.S. Pat. No. 9,336,517, United Services Automobile Association (USAA)'s Patent Owner Preliminary Response, dated Mar. 4, 2019, 91 pgs.
CBM2019-00003 U.S. Pat. No. 9,336,517, Declaration of Matthew Calman In Support of Patent Owner Preliminary Response, dated Mar. 4, 2019, 15 pgs.
CBM2019-00003 U.S. Pat. No. 9,336,517, Katie Knight Videotape Deposition Transcript, dated Feb. 8, 2019, 27 pgs.
CBM2019-00003 U.S. Pat. No. 9,336,517, Peter Alexander, Ph.D., Oral and Videotaped Deposition, dated Jan. 23, 2019, 27 pgs.
CBM2019-00003 U.S. Pat. No. 9,336,517, United Services Automobile Association (USAA)'s Updated Exhibit List Pursuant to 37 CFR 42.63(e), dated Mar. 19, 2019, 8 pgs.
CBM2019-00003 U.S. Pat. No. 9,336,517, Petitioner's Reply Brief to Patent Owner Preliminary Response Pursuant to Authorization Provided in Paper No. 14, dated Apr. 10, 2019, 10 pgs.
CBM2019-00004 U.S. Pat. No. 8,977,571, Declaration of Tim Crews In Support of Patent Owner Preliminary Response, dated Feb. 20, 2019, 8 pgs.
CBM2019-00004 U.S. Pat. No. 8,977,571, United Services Automobile Association (USAA)'s Patent Owner Preliminary Response, dated Feb. 20, 2019, 99 pgs.
CBM2019-00004 U.S. Pat. No. 8,977,571, Declaration of Matthew Calman In Support of Patent Owner Preliminary Response, dated Feb. 20, 2019, 14 pgs.
CBM2019-00004 U.S. Pat. No. 8,977,571, United Services Automobile Association (USAA)'s Updated Exhibit List Pursuant to 37 CFR 42.63(e), dated Mar. 19, 2019, 8 pgs.
CBM2019-00005 U.S. Pat. No. 8,699,779, United Services Automobile Association's (USAA)'s Patent Owner Preliminary Response, dated Mar. 4, 2019, 103 pgs.
CBM2019-00005 U.S. Pat. No. 8,699,779, Katie Knight Videotape Deposition Transcript, dated Feb. 8, 2019, 27 pgs.
CBM2019-00005 U.S. Pat. No. 8,699,779 Matthew A. Calman Declaration, dated Mar. 4, 2019, 15 pgs.
CBM2019-00005 U.S. Pat. No. 8,699,779 Peter Alexander, Ph.D., Oral and Videotaped Deposition, dated Jan. 23, 2019, 27 pgs.
CBM2019-00027 U.S. Pat. No. 9,224,136 Declaration of Peter Alexander, Ph.D., dated Mar. 28, 2019, 147 pgs.
CBM2019-00027 U.S. Pat. No. 9,224,136 Petition For Covered Business Method Review of Claims 1-3, 5-9, 11-16 and 18 of U.S. Pat. No. 9,224,136, dated Mar. 28, 2019, 93 pgs.
CBM2019-00027 U.S. Pat. No. 9,224,136 Notice of Filing Date Accorded to Petition and Time for Filing Patent Owner Preliminary Response, dated Apr. 8, 2019, 3 pgs.
CBM2019-00028 U.S. Pat. No. 10,013,681, Plaintiff United Services Automobile Association (USAA) Preliminary Claim Constructions And Extrinsic Evidence, dated Mar. 15, 2019, 74 pgs.
CBM2019-00028 U.S. Pat. No. 10,013,681, Declaration of Peter Alexander, Ph.D., dated Mar. 28, 2019, 94 pgs.
CBM2019-00028 U.S. Pat. No. 10,013,681, Petition For Covered Business Method Review of Claims 1-30 of U.S. Pat. No. 10,013,681, dated Mar. 28, 2019, 99 pgs.
CBM2019-00028 U.S. Pat. No. 10,013,681, Petitioner's Updated Exhibit List (as of Apr. 1, 2019) for U.S. Pat. No. 10,013,681, dated Apr. 1, 2019, 5 pgs.
CBM2019-00028 U.S. Pat. No. 10,013,681, Notice of Filing Date Accorded To Petition and Time For Filing Patent owner Preliminary Response for U.S. Pat. No. 10,013,681, dated Apr. 8, 2019, 3 pgs.
CBM2019-00029 U.S. Pat. No. 10,013,605, Declaration of Peter Alexander, Ph.D., dated Mar. 28, 2019, 76 pgs.
CBM2019-00029 U.S. Pat. No. 10,013,605, Petition For Covered Business Method Review of Claims 1-3, 5-14, 16-29 of U.S. Pat. No. 10,013,605, dated Mar. 28, 2019, 88 pgs.
CBM2019-00029 U.S. Pat. No. 10,013,605, Plaintiff United Services Automobile Association (USAA) Preliminary Claim Constructions And Extrinsic Evidence, dated Mar. 15, 2019, 74 pgs.
IPR2019-00815 U.S. Pat. No. 9,818,090, Petition For Inter Partes Review of Claims 1-9 of U.S. Pat. No. 9,818,090, dated Mar. 20, 2019, 56 pgs.
IPR2019-00815 U.S. Pat. No. 9,818,090, Declaration of Peter Alexander, PhD. as filed in the IPR on Mar. 20, 2019, 99 pgs.
IPR2019-00815 U.S. Pat. No. 9,818,090, Notice of Filing Date Accorded to Petition and Time for Filing Patent Owner Preliminary Response, dated Mar. 27, 2019, 5 pgs.
IPR2019-00815 U.S. Pat. No. 9,818,090, Exhibit B Proposed Claim Constructions for the '571, '090, '779 and '517 Patents, filed on Feb. 28, 2019, 10 pgs.
ABA Routing Transit Number, Wikipedia, dated Sep. 27, 2006, 3 pgs.
Accredited Standards Committee Technical Report TR 33-2006, dated Aug. 28, 2006, 75 pgs.
ANS X9.100-140-2004, "Specification for an Image Replacement Document—IRD", American Standard for Financial Services, Oct. 1, 2004, 15 pgs.
ANSI News, Check 21 Goes Into Effect Oct. 28, 2004, dated Oct. 25, 2004, 1 pg.
ANSI, “Return Reasons for Check Image Exchange of IRDS”, dated May 6, 2016, 23 pgs.
ANSI, “Specifications For Electronic Exchange of Check and Image Data”, dated Jul. 11, 2006, 230 pgs.
ANSI X9.7-1999(R2007), Bank Check Background and Convenience Amount Field Specification, dated Jul. 11, 2007, 86 pgs.
ASC X9, "Specification for Electronic Exchange of Check and Image Data", dated Mar. 31, 2003, 156 pgs.
Bankers' Hotline, “Training Page: Learning the Bank Numbering System”, Copyright 2004, 2 pgs.
BrainJar Validation Algorithms, archived on Mar. 16, 2016 from BrainJar.com, 2 pgs.
Canon White Paper, “Two Words Every Business Should Know—Remote Deposit”, dated 2005, 7 pgs.
CBR online, “Diebold launches ATM depository technology”, Oct. 4, 2007, 5 pgs.
Cheq Information Technology White Paper, “Teller Scanner Performance and Scanner Design: Camera Position Relative to the Feeder”, dated 2005, 7 pgs.
De Jesus, Angie et al., “Distributed Check Processing In a Check 21 Environment”, dated Nov. 2004, 22 pgs.
Federal Reserve Adoption of DSTU X9.37-2003, Image Cash Letter Customer Documentation Version 1.8, dated Oct. 1, 2008, 48 pgs.
Fielding, R. et al, “RFC-2616—Hypertext Transfer Protocol”, Network Working Group, The Internet Society copyright 1999, 177 pgs.
Hill, Simon, “From J-Phone to Lumina 1020: A Complete History of the Camera Phone”, dated Aug. 11, 2013, 19 pgs.
Instrument—Definition from the Merriam-Webster Online Dictionary, dated Mar. 2, 2019, 1 pg.
Instrument—Definition of Instrument from the Oxford Dictionaries (British & World English), dated Jul. 2, 2017, 44 pgs.
iPhone Application Programming Guide: Device Support, dated Apr. 26, 2009, 7 pgs.
Apple Announces the New iPhone 3GS—The Fastest, Most Powerful iPhone Yet, Press Release dated Jun. 8, 2009, 4 pgs.
Klein, Robert, Financial Services Technology, “Image Quality and Usability Assurance: Phase 1 Project”, dated Jul. 23, 2004, 67 pgs.
Lange, Bill, “Combining Remote Capture and IRD Printing, A Check 21 Strategy For Community and Regional Banks”, dated 2005, 25 pgs.
Lee, Jeanne, “Mobile Check Deposits: Pro Tips to Ensure They Go Smoothly”, dated Feb. 19, 2016, 6 pgs.
Meara, Bob, “State of Remote Deposit Capture 2015: Mobile Is the New Scanner”, Dated May 26, 2015, obtained from the Internet at: https://www.celent.com/insights/57842967, 3 pgs.
Meara, Bob, "State of Remote Deposit Capture 2015: Mobile Is the New Scanner", dated May 2015, 56 pgs.
Meara, Bob, "USAA's Mobile Remote Deposit Capture", dated Jun. 26, 2009, 2 pgs.
Mitek's Mobile Deposit Processes More Than Two Billion Checks, $1.5 Trillion in Cumulative Check Value, dated Mar. 18, 2018, 2 pgs.
Mitek, “Video Release—Mitek MiSnap™ Mobile Auto Capture Improves Mobile Deposit® User Experience at Ten Financial Institutions”, dated Jul. 15, 2014, 2 pgs.
NCR, Mobile Remote Deposit Capture (RDC), copyright 2011, 8 pgs.
Nokia N90 Review Digital Trends, dated Feb. 11, 2019, obtained from the Internet at: https://www.digitaltrends.com/cell-phone-reviews/nokia-n90-review/, 11 pgs.
Nokia N95 8GB User Guide, copyright 2009, (from the Wells Fargo Bank, N.A. IPR2019-00815, filed on Mar. 20, 2019), Part 1 of 3, 67 pgs.
Nokia N95 8GB User Guide, copyright 2009, (from the Wells Fargo Bank, N.A. IPR2019-00815, filed on Mar. 20, 2019), Part 2 of 3, 60 pgs.
Nokia N95 8GB User Guide, copyright 2009, (from the Wells Fargo Bank, N.A. IPR2019-00815, filed on Mar. 20, 2019), Part 3 of 3, 53 pgs.
Patel, Kunur, Ad Age, “How Mobile Technology Is Changing Banking's Future”, dated Sep. 21, 2009, 3 pgs.
Remote Deposit Capture Basic Requirements, dated Aug. 22, 2009, 1 pg.
Remote Deposit Capture.com Scanner Matrix, dated Oct. 21, 2011, 3 pgs.
Rowles, Tony, USAA v. Wells Fargo No. 2:18-cv-245-JRG e-mail correspondence dated Jan. 24, 2019, 2 pgs.
Sechrest, Stuart et al., “Windows XP Performance”, Microsoft, dated Jun. 1, 2001, 20 pgs.
Spencer, Harvey, "White Paper Check 21 Controlling Image Quality At The Point of Capture", dated 2004, 7 pgs.
Timothy R. Crews list of Patents, printed from the United States Patent and Trademark Office on Feb. 13, 2019, 7 pgs.
Van Dyke, Jim, “2017 Mitek Mobile Deposit Benchmark Report”, copyright 2017, 50 pgs.
Wausau, “Understanding Image Quality & Usability Within a New Environment”, copyright 2019, 1 pg.
Whitney, Steve et al., “A Framework For Exchanging Image Returns”, dated Jul. 2001, 129 pgs.
CBM2019-00005 U.S. Pat. No. 8,699,779, Patent Owner's Sur-Reply Brief to Petitioner's Reply Brief to Patent Owner Preliminary Response Pursuant to Authorization Provided in Paper No. 15, dated May 1, 2019, 7 pgs.
CBM2019-00004 U.S. Pat. No. 8,977,571, Defendant's Claim Construction Brief, United Services Automobile Association v. Wells Fargo Bank, N.A., Civil Action No. 2:18-cv-245, dated Apr. 25, 2019, 36 pgs.
CBM2019-00004 U.S. Pat. No. 8,977,571, Patent Owner's Sur-Reply Brief to Petitioner's Reply Brief to Patent Owner Preliminary Response Pursuant to Authorization Provided in Paper 14, dated Apr. 30, 2019, 7 pgs.
USAA's Reply to Claim Construction Brief, United Services Automobile Association v. Wells Fargo Bank, N.A., Civil Action No. 2:18-cv-245, dated May 2, 2019, 15 pgs.
Plaintiff and Counterclaim Defendant's Answer to Defendant and Counterclaims Plaintiff's Amended Answer, Affirmative Defenses, & Counterclaims, United Services Automobile Association v. Wells Fargo Bank, N.A., Civil Action No. 2:18-cv-366, dated Apr. 26, 2019, 18 pgs.
USAA's Reply Claim Construction Brief, United Services Automobile Association v. Wells Fargo Bank, N.A., Civil Action No. 2:18-cv-245, dated May 2, 2019, 227 pgs.
Parties' P.R. 4-5(D) Joint Claim Construction Chart, United Services Automobile Association v. Wells Fargo Bank, N.A., Civil Action No. 2:18-cv-245, dated May 9, 2019, 25 pgs.
CBM2019-00002 U.S. Pat. No. 9,818,090, Decision Denying Institution of Covered Business Method Patent Review 37 C.F.R. § 42.208, dated Apr. 26, 2019, 5 pgs.
CBM2019-00003 U.S. Pat. No. 9,336,517, Decision Denying Institution of Covered Business Method Patent Review 37 C.F.R. § 42.208, dated Jun. 3, 2019, 28 pgs.
CBM2019-00004 U.S. Pat. No. 8,977,571, Decision Denying Institution of Covered Business Method Patent Review 37 C.F.R. § 42.208, dated May 15, 2019, 33 pgs.
CBM2019-00005 U.S. Pat. No. 8,699,779, Decision Denying Institution of Covered Business Method Patent Review 37 C.F.R. § 42.208, dated Jun. 3, 2019, 27 pgs.
USAA's Opening Claim Construction Brief, filed in Civil Action No. 2:18-CV-366, dated May 17, 2019, 670 pgs.
Defendant's Claim Construction Brief, filed in Civil Action No. 2:18-CV-366, dated May 31, 2019, 111 pgs.
Plaintiff's Notice of Filing Claim Construction Presentation, filed in Civil Action No. 2:18-CV-245, dated May 23, 2019, 106 pgs.
IPR2019-01081 U.S. Pat. No. 9,336,517, Petition for Inter Partes Review of Claims 1, 5-10, 12-14, 17-20 of U.S. Pat. No. 9,336,517, dated Jun. 5, 2019, 78 pgs.
IPR2019-01082 U.S. Pat. No. 8,977,571, Petition for Inter Partes Review of Claims 1-13 of U.S. Pat. No. 8,977,571, dated Jun. 5, 2019, 75 pgs.
IPR2019-01083 U.S. Pat. No. 8,699,779, Petition for Inter Partes Review of Claims 1-18 of U.S. Pat. No. 8,699,779, dated Jun. 5, 2019, 74 pgs.
Plaintiff's Notice of Decisions Denying Institution of Covered Business Method Patent Review, filed in Civil Action No. 2:18-CV-245, dated Jun. 6, 2019, 61 pgs.
Claim Construction Memorandum Opinion and Order, filed in Civil Action No. 2:18-CV-245, dated Jun. 13, 2019, 48 pgs.
Parties' P.R.4-5(D) Joint Claim Construction Chart, filed in Civil Action No. 2:18-CV-245, dated Jun. 14, 2019, 28 pgs.
Defendant's Claim Construction Brief, filed in Civil Action No. 2:18-CV-366, dated May 31, 2019, 28 pgs.
USAA'S Reply Claim Construction Brief, filed in Civil Action No. 2:18-CV-366, dated Jun. 7, 2019, 14 pgs.
Wells Fargo's Objections to Magistrate Judge Payne's Claim Construction Memorandum Opinion and Order, filed in Civil Action No. 2:18-CV-245, dated Jun. 27, 2019, 7 pgs.
USAA's Objections to Magistrate Judge Payne's Claim Construction Memorandum Opinion and Order, filed in Civil Action No. 2:18-CV-245, dated Jun. 27, 2019, 6 pgs.
Parties' P.R. 4-5(D) Joint Claim Construction Chart, filed in Civil Action No. 2:18-CV-366, dated Jun. 18, 2019, 27 pgs.
IPR2019-00815, Invalidity Chart, uploaded on Jun. 27, 2019, 94 pgs.
IPR2019-00815, United Services Automobile Association (“USAA”)'s Patent Owner Preliminary Response, dated Jun. 27, 2019, 66 pgs.
IPR2019-00815, Supplemental Invalidity Chart, dated Jun. 27, 2019, 16 pgs.
IPR2019-00815, Declaration of Matthew A. Calman in Support of Patent Owner Preliminary Response, dated Jun. 27, 2019, 25 pgs.
CBM2019-00027, Declaration of Bharat Prasad, dated Jul. 8, 2019, 32 pgs.
CBM2019-00027, Patent Owner Preliminary Response and Exhibits 2001-2042, dated Jul. 8, 2019, 91 pgs.
CBM2019-00028, United Services Automobile Association ("USAA")'s Patent Owner Preliminary Response, dated Jul. 8, 2019, 73 pgs.
CBM2019-00028, Declaration of Matthew A. Calman in Support of Patent Owner Preliminary Response, dated Jul. 8, 2019, 28 pgs.
CBM2019-00028, Malykhina, Elena “Get Smart”, Copyright 2006 by ProQuest Information and Learning Company, 6 pgs.
CBM2019-00028, Palm Treo 700W Smartphone manual, Copyright 2005 by Palm, Inc., 96 pgs.
CBM2019-00028, C720w User Manual for Windows Mobile Smart Phone, Copyright 2006, 352 pgs.
CBM2019-00028, “Smarter Than Your Average Phone”, Copyright 2006 by Factiva, 4 pgs.
CBM2019-00028, “64 Million Smart Phones Shipped Worldwide in 2006”, Canalys Newsroom, 2006, 3 pgs.
CBM2019-00028, Nokia 9500 Communicator user Guide, Copyright 2006 by Nokia Corporation, 112 pgs.
CBM2019-00028, Robinson, Daniel, “Client Week—Handsets advance at 3GSM”, Copyright 2004 by VNU Business Publications Ltd., 2 pgs.
CBM2019-00028, Burney, Brett “MacBook Pro with Intel processor is fast, innovative”, Copyright 2006 by Plain Dealer Publishing Co., 2 pgs.
CBM2019-00028, 17-inch MacBook Pro User's Guide, Copyright 2006 by Apple Computer, Inc., 144 pgs.
CBM2019-00028, Wong, May “HP unveils new mobile computers”, Copyright 2006 by The Buffalo News, 2 pgs.
CBM2019-00028, Jewell, Mark “Cell Phone Shipments Reach Record 208M”, Copyright 2005 by Associated Press, 1 pg.
CBM2019-00028, Lawler, Ryan "Apple shows Intel-based Macs, surge in revenue", Copyright 2006 by The Yomiuri Shimbun, 2 pgs.
CBM2019-00028, Aspire 9800 Series User Guide, Copyright 2006 by Acer International, 122 pgs.
CBM2019-00028, Dell XPS M1210 Owner's Manual, Copyright 2006 by Dell Inc., 192 pgs.
CBM2019-00028, Estridge, Bonnie "Is your phone smart enough?: The series that cuts through the technobabble to bring you the best advice on the latest gadgets", Copyright 2006 by XPRESS—Al Nisr Media, 3 pgs.
CBM2019-00028, "Motorola, Palm collaborate on smart phone", Copyright 2000 by Crain Communications, Inc., 1 pg.
CBM2019-00028, Nasaw, Daniel "Viruses Pose threat to 'Smart' Cellphones—Computer Programs Could Cripple Devices and Shut Down Wireless Networks", Copyright 2004 by Factiva, 2 pgs.
CBM2019-00028, Seitz, Patrick "Multifunction Trend Shaking Up The Handheld Device industry; Solid Sales Expected in 2004; PDA, handset, camera—one single, small product can fill a variety of roles", Copyright 2004 Investor's Business Daily, Inc., 3 pgs.
Microsoft Mobile Devices Buyer's Guide, 2002, 4 pgs.
Microsoft Mobile Devices Smartphone, 2003, 2 pgs.
Plaintiff's Notice of Decision Denying Institution of Covered Business Method Patent Review, filed in Civil Action No. 2:18-CV-245, dated May 15, 2019, 36 pgs.
Defendant's Claim Construction Brief, filed in Civil Action No. 2:18-CV-366, dated Jun. 24, 2019, 28 pgs.
CBM2019-00029, United Services Automobile Association (USAA)'s Patent Owner Preliminary Response, dated Jul. 17, 2019, 76 pgs.
CBM2019-00029, Declaration of Matthew A. Calman in Support of Patent Owner Preliminary Response, dated Jul. 17, 2019, 29 pgs.
CBM2019-00029, Defendant's Claim Construction Brief, filed in Civil Action No. 2:18-CV-366, dated May 31, 2019, 28 pgs.
CBM2019-00029, Palenchar, Joseph, “PDA Phone Adds WiFi VoIP, Turn-By-Turn GPS Navigation”, Copyright 2006 by Reed Business Information, 2 pgs.
CBM2019-00029, HP User Guide, Additional Product Information, Copyright 2006 by Hewlett-Packard Development Company, L.P., 204 pgs.
CBM2019-00029, Pocket PC User Manual, Version 1, dated May 2006 by Microsoft, 225 pgs.
CBM2019-00029, “Dynamism.com: Take tomorrow's tech home today with Dynamism.com: Latest gadgets merge next generation technology with high style design”, Copyright 2006 Normans Media Limited, 2 pgs.
IPR2019-00815, Federal Reserve Financial Services Retired: DSTU X9.37-2003, Specifications for Electronic Exchange of Check and Image Data, Copyright 2006 by Accredited Standards Committee X9, Inc., dated Mar. 31, 2003, 157 pgs.
IPR2019-01081, Declaration of Peter Alexander, Ph.D, dated Jun. 5, 2019, 135 pgs.
USAA's Opening Claim Construction Brief, United Services Automobile Association v. Wells Fargo Bank, N.A., Civil Action No. 2:18-cv-245, dated Apr. 11, 2019, 32 pgs.
P.R. 4-3 Joint Claim Construction and Pre-Hearing Statement, United Services Automobile Association v. Wells Fargo Bank, N.A., Civil Action No. 2:18-cv-366, dated Apr. 5, 2019, 190 pgs.
Defendant Wells Fargo Bank, N.A.'s Amended Answer, Affirmative Defenses, and Counterclaims to Plaintiff's Complaint, United Services Automobile Association v. Wells Fargo Bank, N.A., Civil Action No. 2:18-cv-366, dated Apr. 12, 2019, 32 pgs.
Plaintiff and Counterclaim Defendant's Answer to Defendant and Counterclaims Plaintiff's Amended Answer, Affirmative Defenses, & Counterclaims, United Services Automobile Association v. Wells Fargo Bank, N.A., Civil Action No. 2:18-cv-245, dated Mar. 21, 2019, 36 pgs.
Defendant Wells Fargo Bank, N.A.'s Second Amended Answer, Affirmative Defenses, and Counterclaims To Plaintiff's Amended Complaint, United Services Automobile Association v. Wells Fargo Bank, N.A., Civil Action No. 2:18-cv-245, dated Aug. 1, 2019, 72 pgs.
Claim Construction Memorandum Opinion and Order, United Services Automobile Association v. Wells Fargo Bank, N.A., Civil Action No. 2:18-cv-366, dated Jul. 29, 2019, 36 pgs.
Wells Fargo's Objections To Magistrate Judge Payne's Claim Construction Memorandum Opinion and Order, United Services Automobile Association v. Wells Fargo Bank, N.A., Civil Action No. 2:18-cv-366, dated Aug. 12, 2019, 7 pgs.
USAA's Objections To Magistrate Judge Payne's Claim Construction Memorandum Opinion and Order, United Services Automobile Association v. Wells Fargo Bank, N.A., Civil Action No. 2:18-cv-366, dated Aug. 12, 2019, 10 pgs.
IPR2019-00815 U.S. Pat. No. 9,818,090, Petitioner's Reply Brief to Patent Owner Preliminary Response Pursuant To Authorization Provided In Paper No. 13, dated Aug. 1, 2019, 9 pgs.
IPR2019-00815 U.S. Pat. No. 9,818,090, Petitioner's Supplemental Exhibit List, dated Aug. 1, 2019, 5 pgs.
IPR2019-00815 U.S. Pat. No. 9,818,090, United Services Automobile Association (“USAA”)'s Sur-Reply In Support of Patent Owner Preliminary Response, dated Aug. 8, 2019, 8 pgs.
IPR2019-00815 U.S. Pat. No. 9,818,090, Decision Denying Institution of Inter Partes Review, dated Aug. 26, 2019, 28 pgs.
Mitek Video titled “Mobile Deposit Tour”, Published on Jul. 2, 2009 by Mitek Systems, duration 2 minutes and 13 seconds, located on the Internet at: https://www.youtube.com/watch?v=sGD49ybxS2Q, 25 pgs.
Provisional patent application filed by Wells Fargo Bank, dated Jan. 29, 2008, 134 pgs.
SCH-i910 Portable Dualmode Smartphone User Guide by Samsung, Copyright 2009 Samsung Electronics Canada, downloadable from www.manualslib.com, 168 pgs.
IPR2020-00092, U.S. Pat. No. 9,569,756, Petition for Inter Partes Review of Claims 1-7, 9-17, 29 and 30 of U.S. Pat. No. 9,569,756, dated Nov. 7, 2019, 74 pgs.
U.S. Appl. No. 61/022,279, dated Jan. 18, 2008, (cited in IPR2020-00090, U.S. Pat. No. 9,177,197), 35 pgs.
Herley, Cormac, “Recursive Method To Extract Rectangular Objects From Scans”, Microsoft Research, Oct. 2003, 4 pgs.
Panini My Vision X Operator Manual, Panini, 2004, (cited in IPR2020-00093. U.S. Pat. No. 9,892,454), 51 pgs.
Tochip, E. et al., “Camera Phone Color Appearance Utility”, Matlab at Stanford University, 2007, 25 pgs.
Yeo, L.H. et al., “Submission of transaction from mobile workstations in a cooperative multidatabase environment”, IEEE, 1994, (cited in IPR2020-00097, U.S. Pat. No. 7,885,880), 10 pgs.
Andrew S. Tanenbaum, Modern Operating Systems, Second Edition (2001).
Arnold et al, The Java Programming Language, Fourth Edition (2005).
Consumer Assistance & Information—Check 21 https://www.fdic.gov/consumers/assistance/protection/check21.html (FDIC).
Halonen et al., GSM, GPRS, and EDGE Performance: Evolution Towards 3G/UMTS, Second Edition (2003).
Heron, Advanced Encryption Standard (AES), 12 Network Security 8 (2009).
Immich et al., Performance Analysis of Five Interprocess Communication Mechanisms Across UNIX Operating Systems, 68 J. Sys. & Software 27 (2003).
Leach, et al., A Universally Unique Identifier (UUID) URN Namespace, (Jul. 2005) retrieved from https://www.ietf.org/rfc/rfc4122.txt.
N. Ritter & M. Ruth, The GeoTIFF Data Interchange Standard for Raster Geographic Images, 18 Int. J. Remote Sensing 1637 (1997).
Pbmplus—image file format conversion package, retrieved from https://web.archive.org/web/20040202224728/https:/www.acme.com/software/pbmplus/.
Petition filed by PNC Bank N.A. for Inter Partes Review of Claims 1-23 of U.S. Pat. No. 10,482,432, dated Jul. 14, 2021, IPR2021-01071, 106 pages.
Petition filed by PNC Bank N.A. for Inter Partes Review of Claims 1-7, 10-21 and 23 of U.S. Pat. No. 10,482,432, dated Jul. 14, 2021, IPR2021-01074.
Petition filed by PNC Bank N.A. for Inter Partes Review of Claims 1-18 of U.S. Pat. No. 10,621,559, dated Jul. 21, 2021, IPR2021-01076, 111 pages.
Petition filed by PNC Bank N.A. for Inter Partes Review of Claims 1-18 of U.S. Pat. No. 10,621,559, filed Jul. 21, 2021, IPR2021-01077; 100 pages.
Petition filed by PNC Bank N.A. for Inter Partes Review of Claims 1-30 of U.S. Pat. No. 10,013,681, filed Aug. 27, 2021, IPR2021-01381, 127 pages.
Petition filed by PNC Bank N.A. for Inter Partes Review of U.S. Pat. No. 10,013,605, filed Aug. 27, 2021, IPR2021-01399, 113 pages.
Readdle, Why Scanner Pro is Way Better Than Your Camera? (Jun. 27, 2016) retrieved from https://readdle.com/blog/why-scanner-pro-is-way-better-than-your-camera.
Santomero, The Evolution of Payments in the U.S.: Paper vs. Electronic (2005) retrieved from https://web.archive.org/web/20051210185509/https://www.philadelphiafed.org/publicaffairs/speeches/2005_santomero9.html.
Schindler, Scanner Pro Review (Dec. 27, 2016) retrieved from https://www.pcmag.com/reviews/scanner-pro.
Sing Li & Jonathan Knudsen, Beginning J2ME: From Novice to Professional, Third Edition (2005), ISBN (pbk): 1-59059-479-7, 468 pages.
Wang, Ching-Lin et al., "Chinese document image retrieval system based on proportion of black pixel area in a character image", The 6th International Conference on Advanced Communication Technology, vol. 1, IEEE, 2004.
Zaw, Kyi Pyar and Zin Mar Kyu, "Character Extraction and Recognition for Myanmar Script Signboard Images using Block based Pixel Count and Chain Codes", 2018 IEEE/ACIS 17th International Conference on Computer and Information Science (ICIS), IEEE, 2018.
Jung et al, “Rectangle Detection based on a Windowed Hough Transform”, IEEE Xplore, 2004, 8 pgs.
Certificate of Accuracy related to Article entitled, “Deposit checks by mobile” on webpage: https://www.elmundo.es/navegante/2005/07/21/empresas/1121957427.html signed by Christian Paul Scrogum (translator) on Sep. 9, 2021.
Fletcher, Lloyd A., and Rangachar Kasturi, “A robust algorithm for text string separation from mixed text/graphics images”, IEEE transactions on pattern analysis and machine intelligence 10.6 (1988), 910-918 (1988).
IPR 2022-00076 filed Nov. 17, 2021 on behalf of PNC Bank N.A., 98 pages.
IPR 2022-00075 filed Nov. 5, 2021 on behalf of PNC Bank N.A., 90 pages.
IPR 2022-00050 filed Oct. 22, 2021 on behalf of PNC Bank N.A., 126 pages.
IPR 2022-00049 filed Oct. 22, 2021 on behalf of PNC Bank N.A., 70 pages.
About Network Servers, GlobalSpec (retrieved from https://web.archive.org/web/20051019130842/http://globalspec.com:80/LearnMore/Networking_Communication_Equipment/Networking_Equipment/Network_Servers) ("GlobalSpec").
FDIC: Check Clearing for the 21st Century Act (Check 21), Fed. Deposit Ins. Corp., Apr. 25, 2016 (retrieved from https://web.archive.org/web/20161005124304/https://www.fdic.gov/consumers/assistance/protection/check21.html) ("FDIC").
Bruno-Britz, Maria “Mitek Launches Mobile Phone Check Capture Solution,” Bank Systems and Technologies Information Week (Jan. 24, 2008).
V User Guide, https://www.lg.com/us/support/manualsdocuments?customerModelCode=%20LGVX9800&csSalesCode=LGVX9800, select "VERIZON (USA) en"; The V_UG_051125.pdf.
MING Phone User Manual, 2006.
Moseik, Celeste K., “Customer Adoption of Online Restaurant Services: A Multi-Channel Approach”, Order No. 1444649 University of Delaware, 2007, Ann Arbor ProQuest., Web. Jan. 10, 2022 (Year: 2007).
Continuations (6)
Number Date Country
Parent 16712182 Dec 2019 US
Child 16831276 US
Parent 16280455 Feb 2019 US
Child 16712182 US
Parent 15792966 Oct 2017 US
Child 16280455 US
Parent 15392950 Dec 2016 US
Child 15792966 US
Parent 13922686 Jun 2013 US
Child 15392950 US
Parent 12545127 Aug 2009 US
Child 13922686 US