System, computing device, and method for document detection and deposit processing

Information

  • Patent Grant
  • Patent Number
    11,900,755
  • Date Filed
    Monday, November 30, 2020
  • Date Issued
    Tuesday, February 13, 2024
Abstract
An image of a check is captured by an imaging device, and processing of the digital image of the check for deposit at a remote server may be accomplished with a downloaded software application on a portable computing device associated with the imaging device. The downloaded application may include one or more trained machine learning models for processing the captured image. The portable computing device may utilize deterministic algorithms for certain image processing tasks and machine learning models for others. The selection between machine learning and deterministic processing may be made locally on the portable device or in response to instructions from an institution server to use a particular processing method.
Description
BACKGROUND

Electronic check image detection and remote check depositing are part of technological improvements to the manual process of check depositing known as remote deposit. In general, checks provide a convenient method for an individual such as a payor to transfer funds to a payee. To use a check, an individual usually opens a checking account, or other similar account, at a financial institution and deposits funds, which are then available for later withdrawal. To transfer funds with a check, the payor usually designates a payee and an amount payable on the check. In addition, the payor often signs the check. Once the check has been signed, it is usually deemed negotiable, meaning the check may be validly transferred to the payee upon delivery. By signing and transferring the check to the payee, the payor authorizes funds to be withdrawn from the payor's account on behalf of the payee.


While a check may provide a payor with a convenient and secure form of payment, receiving a check may put certain burdens on the payee, such as the time and effort required to deposit the check. For example, depositing a check typically involves going to a local bank branch and physically presenting the check to a bank teller.


To reduce such burdens for the payee, as well as the payor, remote deposit technology has been developed to include systems and methods that enable the remote deposit of checks and eliminate the need for some of the manual processes in prior check deposit procedures. However, existing remote deposit systems typically utilize specific, deterministic algorithms to analyze the various features of digital images of checks that are captured by remote devices of account holders and transmitted to financial institutions for deposit. The deterministic algorithms, while typically a good tool for handling digital images having a fixed set of characteristics, may not be as accurate as desired in identifying features or qualities of captured images over a wider variation of characteristics, and thus may lead to delays or failures in the processing of remote deposits.





DESCRIPTION OF THE FIGURES

The present disclosure may be better understood with reference to the following drawings and description. Non-limiting and non-exhaustive descriptions are described with reference to the following drawings. The components in the figures are not necessarily to scale, emphasis instead being placed upon illustrating principles. In the figures, like referenced numerals may refer to like parts throughout the different figures unless otherwise specified.



FIG. 1 shows an exemplary system for implementing remote deposit.



FIG. 2 shows an exemplary digital image depicting a check and a background.



FIG. 3 shows an exemplary modified digital image depicting a check and a background including edge coordinates.



FIG. 4 shows an exemplary segmented digital image depicting a check and a background.



FIG. 5 shows an exemplary grayscale histogram of a digital image.



FIG. 6 shows a block diagram of a set of machine learning models that may be stored in the machine learning model repository illustrated in FIG. 1.



FIG. 7 shows a flow diagram of a model training technique for training an ML model to identify bounding corners and crop an image of a check.



FIG. 8 shows a dewarping machine learning model.



FIG. 9 shows a denoising machine learning model.



FIG. 10 shows a duplicate check detection process.



FIG. 11 shows a duplicate check detection machine learning model training diagram.



FIG. 12 shows a block diagram of an exemplary computer architecture of a computing device or institution computer.



FIG. 13 shows a flow diagram of an image processing feature.



FIG. 14 shows a flow diagram of an image processing feature.



FIG. 15 shows a flow diagram of a method of determining suitability of a computing device for including machine learning models and deterministic algorithms for image processing tasks.



FIG. 16 shows a flow diagram of a method of determining which machine learning model or deterministic algorithm for image processing tasks to select for a particular remote deposit transaction.



FIG. 17 shows a flow diagram of a method of using a first machine learning model to select or download a second machine learning model or deterministic model for processing a check image.





DETAILED DESCRIPTION

The methods, devices, and systems discussed below may be embodied in a number of different forms. Not all of the depicted components may be required, however, and some implementations may include additional, different, or fewer components from those expressly described in this disclosure. Variations in the arrangement and type of the components may be made without departing from the spirit or scope of the claims as set forth herein.


Within the technological field of remote deposit, a payor may present a check, in either paper or electronic form, to a payee for deposit. To deposit the check into a financial account at a financial institution associated with the payee, the payee may scan the check into a digital image using a scanner, digital camera, or other imaging device in communication with a computing device. A financial institution may then receive from the payee the digital image of the check. The financial institution may then use the digital image to credit funds to the payee. Such remote deposit techniques rely on the efficient and accurate detection of a check depicted within a digital image, where part of the check image detection includes detecting and extracting the check image from background portions of the digital image that are not part of the check image.


Traditional, deterministic image processing algorithms can fail to clearly and reliably detect the border of a check. Such a failure can prevent successful processing of check contents, and it is often the result of insufficient contrast between the background portions and the check in a captured image. Another common challenge for finding a check in an image with traditional algorithms is the presence of reflections or shadows in the image. As described herein, a machine learning model, which may be more resilient over a range of color and lighting conditions than any single deterministic technique, can be trained to identify a check against the background by, for example, identifying the bounding box coordinates (corners).


Yet another challenge presented by processing images captured of checks is that of de-morphing or de-warping a check image. Distortions in an image can occur when there is a large angle between the camera and the check, or where portions of the check are folded and appear at a different angle than another portion of the check in the image. Again, traditional deterministic algorithms exist that can address certain of these distortions; however, a deep learning (also referred to herein as machine learning) model is disclosed herein that may be trained to learn and apply the necessary translation, scaling, rotation, and shearing of an image in a more robust manner than stand-alone algorithms.


Identifying information within a check image may also be a challenge for systems that capture and process check images. For example, the printed text on a check often cannot be read correctly due to background noise. Traditional systems and deterministic algorithms can have difficulty consistently removing noise from the printed text. This noise may take the form of a signature or other handwritten notes overlapping the printed text, reflections or shadows present in the image, creases or wrinkles in the check, and so on. Here again, a machine learning model may be generated and trained to identify and remove background noise so that a clean image is left for processing.


Yet another check processing task that institutions perform is duplicate check detection. Members of a bank or other financial institution may attempt to deposit the same check multiple times, either fraudulently or by accident. Existing techniques and systems for handling duplicate check detection typically involve a database of check numbers with a limited lookback window, or simply involve manual check reviews. These existing techniques are often subject to missed detection opportunities, and are ultimately susceptible to fraud losses. As also described herein, a machine learning model may be trained to recognize duplicate checks over a greater range of conditions than may be available in current techniques. In one implementation, a perceptual hash may be encoded from the image, allowing for an efficient image similarity check on a large scale that can handle many lighting and shadow variations.


The image processing steps for detecting and extracting the check image from the background portions may be implemented in whole, or in part, on either a user device having captured the digital image, a remote server running an application for remote deposit, or both. Implementing these image processing steps is known to expend significant computing resources on a computing device, and so there are benefits to offloading, or sharing, some or all of the image processing between the user device and the remote server. However, with the improvements to the computing capabilities of mobile computing devices, it is within the scope of this disclosure for one or more, or all, of the image processing features to be implemented directly on the mobile computing device (e.g., user device). For example, the ability to select between existing deterministic algorithms and trained machine learning models may be implemented at the initial remote deposit application download stage, where the mobile computing device and/or remote server device communicate to exchange mobile computing device capabilities.


In one implementation, only a limited set of the total available deterministic or machine learning-enabled techniques, namely those that the mobile computing device can be expected to execute efficiently, will be transferred to the mobile device with the remote deposit application. In other implementations, the complete set of deterministic algorithms and trained machine learning models for image processing may be included in the downloaded app, and the mobile computing device may select which of the algorithms or machine learning models to apply for a given image capture and processing session. Alternatively, the machine learning model need not be persistent on the mobile device and may be temporarily downloaded, on an as-needed basis, via a web interface or a previously downloaded application programming interface (API). Additionally, the download of the machine learning model(s) to the phone or other mobile device need not be directly from the financial institution or a third party remote check deposit handling service. Instead, the model(s) or deterministic algorithms may be downloaded to the phone or mobile device from a local network inside a user's home.


In the disclosed embodiments, features are disclosed for offloading image processing from the remote server to the user device, as well as from the user device to the remote server. The determination of where the image processing will be implemented may be made, for example, according to detected attributes of the captured digital image depicting the check, computing capabilities of the mobile device including the image capture device, and/or environmental conditions (e.g., ambient light levels) detected when the digital image was captured. Given the processing advantage typically afforded a server-based system as compared to a remote device, such as a handheld tablet or smart phone, the bulk of the processing may be described below as primarily taking place on a server of an institution rather than on the remote device of a user. However, some or all of the functionality and circuitry may be included on the user device in different implementations. Also, with respect to the training of deep learning (machine learning) models, the processing power of a remote server is generally used, regardless of the location of the processing that later implements the trained model. In some alternative implementations, training for one or more of the models discussed herein may be performed at the user device. The training of a machine learning model can be updated with localized data from end users. The end user data may be transmitted back to a central server periodically or in real-time for periodic or real-time training updates to better optimize the model.
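
As a rough illustration of the selection logic described above, the following sketch routes a capture session between on-device and server processing, and between a machine learning model and a deterministic algorithm. The profile fields, thresholds, and policy are hypothetical assumptions for illustration only, not details taken from this disclosure; in practice the institution server could also override the local choice.

```python
# Illustrative sketch only: a routing policy for choosing where and how a
# captured check image is processed. Field names and thresholds are
# hypothetical, not taken from this disclosure.
from dataclasses import dataclass

@dataclass
class DeviceProfile:
    ram_mb: int            # available device memory
    has_gpu: bool          # on-device accelerator present
    battery_pct: int       # remaining battery
    ambient_lux: float     # light level measured at capture time

def choose_processing(profile: DeviceProfile) -> dict:
    """Decide (a) where to process and (b) ML model vs. deterministic algorithm."""
    on_device = profile.ram_mb >= 2048 and profile.battery_pct > 20
    # Poor lighting favors the ML models, which are trained across a wider
    # range of conditions than a single deterministic algorithm.
    use_ml = profile.has_gpu or profile.ambient_lux < 100.0
    return {
        "location": "device" if on_device else "server",
        "technique": "ml_model" if use_ml else "deterministic",
    }

print(choose_processing(DeviceProfile(ram_mb=4096, has_gpu=True,
                                      battery_pct=80, ambient_lux=40.0)))
```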



FIG. 1 shows a block diagram of an implementation of a system 100 in which example embodiments may be implemented. A user 105 is shown along with an institution system 205. The institution system 205 may be affiliated with an institution 200, which may be any type of entity capable of processing a transaction involving a negotiable instrument, such as processing checks and/or providing funds associated with checks. For example, the institution 200 may be a financial services institution such as a retail bank, an investment bank, an investment company, a regional branch of the Federal Reserve, a clearinghouse bank and/or a correspondent bank. A representative 185 of the institution 200 may provide assistance, via a representative computing device 187, during any one or more of the processes described herein.


A negotiable instrument is typically a type of contract that obligates one party to pay a specified sum of money to another party. Negotiable instruments may include checks, money orders, cashier's checks, drafts, bills of exchange, promissory notes, and the like. A check instructs a financial institution to pay an amount of money from a specific account held in the payor's name with that financial institution to an account held in the payee's name. A money order is a trusted financial instrument that is a payment order for a pre-specified amount of money. It is a more trusted method of payment than a personal check because the funds for the amount shown on the money order must be prepaid. A cashier's check (also known as a bank check, official check, teller's check, bank draft or treasurer's check) is a check guaranteed by a bank and may be purchased from a bank. Cashier's checks are usually treated as cash since most banks clear them instantly.


The user 105 may operate a computing device 109, which may be a personal computer (PC), a laptop computer, a handheld computing device, a personal digital assistant (PDA), a mobile phone, or a smartphone, for example. The computing device 109 includes an image capture device 115 for capturing an image of the check, where the image capture device 115 may be a digital camera, image scanner, or other device with which the image of the check 107 may be obtained. In some embodiments, the image capture device 115 may be a camera with multiple different lenses, or multiple separate cameras each with its own lens. Each camera or camera lens of a multiple lens image capture device 115 may be controlled individually or concurrently to capture one or more images of a check at the same time or in sequence. Each camera lens may be the same as or different from each other camera lens, to capture the image of a check from differing angles or offsets. Alternatively or additionally, the lenses and/or the associated camera sensor or circuitry in a multiple camera embodiment of an image capture device 115 may be configured to capture a different portion of the electromagnetic spectrum; for example, each lens or camera sensor may be calibrated to a respective one of visible frequencies, infrared frequencies, or ultraviolet frequencies.


The user 105 may be an individual or entity who has an account 165 held at the institution 200, which is accessible via the institution system 205. The account 165 may be any type of account for depositing funds, such as a savings account, a checking account, a brokerage account, and the like. Although only one account 165 is shown, it is contemplated that the user 105 may have any number of accounts held at the institution 200. The user 105 may deposit a check 107 or other negotiable instrument document into the account 165 either electronically or physically. The institution 200 may process and/or clear the check 107, as well as other types of negotiable instruments.


The user 105 may remotely electronically deposit the check 107 at the institution 200 via the computing device 109 being operated by the user 105. For example, an application (sometimes referred to herein as an “app”) for remotely electronically depositing checks may be downloaded, from a third party provider such as the APPLE App Store or from the institution system 205, and installed on the computing device 109, where running the application on the computing device 109 enables the user 105 to remotely electronically deposit the check 107. It is noted that although examples and implementations described herein may refer to a check, the techniques and systems described herein are contemplated for, and may be used for, any negotiable instrument, such as a money order, a cashier's check, a check guaranteed by a bank, or the like. Similarly, the techniques and systems described herein are contemplated for and may be used with any form or document whose image may be captured with a scanner, camera, or other imaging device for subsequent storage and/or processing.


The user 105 may access the institution 200 via the institution system 205 by opening a communication pathway via a communications network 140 using the computing device 109. The communications network 140 may be representative of an intranet, the Internet, a local area network (LAN), a wide area network (WAN), a wireless fidelity (Wi-Fi) network, a public switched telephone network (PSTN), a cellular network, a voice over Internet protocol (VoIP) network, or the like. The user 105 may also communicate with the institution 200 or the institution system 205 by phone, email, instant messaging, text messaging, web chat, facsimile, postal mail, and the like.


There may be several ways in which the communication pathway may be established, including, but not limited to, an Internet connection via a website 218 of the institution system 205, where the website 218 includes content 219 for accepting remote deposits. The user 105 may access the website 218 and log into the website 218 using credentials, such as, but not limited to, a username and a password.


The user 105 may operate the image capture device 115 installed on the computing device 109 to generate a digital check image of the check 107. The digital check image may be used to create a digital image file 135 that is sent to the institution system 205 and used by the institution 200, in conjunction with the institution system 205, to process a deposit of the check 107 whose image is depicted within the digital image file 135. In an implementation, the digital image file 135 may be augmented by secondary data which may be information relating to the deposit of the check 107, such as an account number and a deposit amount, for example.


When capturing the digital check image, the check 107 may be placed on a background, such that the digital check image that is captured by the image capture device 115 comprises an image of the check (e.g., a check image) and a portion of the background (e.g., a background image). Any background may be used, although a dark background or a consistently colored background may provide better results for distinguishing the check image from the background image within the overall initial image. In an implementation, white and yellow backgrounds may not be used. It is noted that although examples and implementations described herein may refer to a check image and check data, the term “check image” may refer to the check in a digital image (as opposed to the background image) and the term “check data” may refer to any data in a digital check image (as opposed to background data). Thus, “check image” and “check data” may refer to the non-background portion of the image and data from the non-background portion of the image in implementations involving any negotiable instrument, form, or document.


In an implementation, the digital check image generated by the image capture device 115 comprises check data and background data. The check data pertains to the check image in the digital check image and the background data pertains to the background in the digital check image on which the image of the check 107 is disposed. An example of such a digital check image 201 is further described with respect to FIG. 2. The digital check image 201 illustrated in FIG. 2 is comprised of a check image 210 and a background image 250 surrounding the check image 210. The check image 210 is shown to include check data such as name, address, date, payment instructions, payment amount, memo, signature line, and Magnetic Ink Character Recognition (MICR) line information. The background image 250 includes image data for the surface, or whatever other environment, that was behind the check 107 when the image capture device 115 captured the digital check image 201. Between the check image 210 and the background image 250 are the edges of the check 107. An edge separates the check image 210 from the background image 250, and edge detection techniques will be described in more detail herein.


In an implementation, a minimum ratio of check image size (or area) to background image size (or area) may be required in the digital check image generated by the image capture device 115. Examples include, but are not limited to, requiring that at least a minimum portion (e.g., five percent) of the digital check image comprise the check image. Other examples of check image to background size requirements include requiring that at least a minimum portion (e.g., one percent) of the digital check image comprise the background image, that the check image be surrounded by at least one pixel of background all the way around the check image, or other predetermined requirements.
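
The example framing requirements above lend themselves to a simple validation step, sketched below. The five percent, one percent, and one-pixel-border figures come from the examples in the preceding paragraph; the function name and bounding-box convention are illustrative assumptions.

```python
# Hedged sketch: validating the example framing requirements described above.
# The thresholds come from the examples in the text; names are illustrative.

def framing_ok(img_w, img_h, check_box):
    """check_box = (left, top, right, bottom) of the detected check image."""
    left, top, right, bottom = check_box
    img_area = img_w * img_h
    check_area = max(0, right - left) * max(0, bottom - top)
    background_area = img_area - check_area
    has_border = left >= 1 and top >= 1 and right <= img_w - 1 and bottom <= img_h - 1
    return (check_area >= 0.05 * img_area and          # >= 5% check image
            background_area >= 0.01 * img_area and     # >= 1% background
            has_border)                                # >= 1 pixel all around

print(framing_ok(1280, 720, (100, 100, 1100, 600)))    # True
```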


Referring back to the computing device 109 shown in FIG. 1, the computing device 109 may comprise an image processing engine 120 that executes image processing features used during the remote deposit process. The image processing engine 120 shown in FIG. 1 includes background processor circuitry 121, a chroma key analyzer 122, an image comparator 123, a grayscale converter 124, an image segmenter 125, segment processor circuitry 126, a segment recombiner 127, and image processor circuitry 128. Additionally, one or more machine learning models may be included in a model repository 129 of the image processing engine 120 that can be executed by the image processor circuitry 128 to handle one or more image processing tasks described herein rather than use a discrete, deterministic algorithm.


When a deterministic algorithm is selected for processing the digital check image 201 depicting the check 107, the background processor circuitry 121 may perform various tasks with respect to the background image 250 included on the digital check image 201. The background image 250 may be a surface on which the check 107 is placed when the digital check image 201 is captured by the image capture device 115. In conjunction with the image capture device 115, the background processor circuitry 121 detects portions of the digital check image 201 corresponding to the background image 250, and distinguishes the background image 250 from the check image 210. The background processor circuitry 121 may further generate a modified digital check image that replaces the background image with a replacement background image. The replacement background image may be selected from one or a plurality of predetermined background images. For example, a predetermined background image may be comprised of a solid color. In an implementation, the background processor circuitry 121 may store the background image 250 and/or the modified digital check image in a storage memory associated with the computing device 109.


Based on the background image 250 being separately detected from the check image 210 according to the techniques applied by the background processor circuitry 121, the image processing engine 120 may identify specific edge coordinates of the check image 210, such as edge coordinates 301, 302, 303, 304 illustrated in FIG. 3. A modified digital check image 300 may be generated by the image processing engine 120 that includes the edge coordinates, where the modified digital check image 300 is included in the digital image file 135 transmitted to the institution system.


An example of an existing deterministic algorithm is first discussed, for comparison with the machine learning models developed from the deep learning approaches discussed in greater detail below. In an implementation, the deterministic approach to finding edges of a check may utilize chroma key technology to detect edge coordinates 301, 302, 303, 304 of the check image 210, and/or provide a uniform or consistent replacement background for the check image 210 in the digital check image 201. Chroma key is a technique for mixing two images together, in which a color or color range from one image is removed (or made transparent), revealing another image behind it. This technique is also known as color keying, green screen, and bluescreen. The chroma key analyzer 122 may use any known chroma key technique for removing the original background image 250 and replacing it with (e.g., revealing) a replacement background image. Based on the background image 250 being separately detected from the check image 210 according to the chroma key technique(s) applied by the chroma key analyzer 122, the image processing engine 120 may identify specific edge coordinates for the check image 210, such as the edge coordinates 301, 302, 303, 304 illustrated in FIG. 3. The modified digital check image 300 may be generated by the image processing engine 120 that includes the edge coordinates, where the modified digital check image 300 is included in the digital image file 135 transmitted to the institution system. Although the edge coordinates 301, 302, 303, 304 are illustrated to correspond to corners of the check image, other edge coordinates may correspond to non-corner edge portions that comprise an outer perimeter of the check.
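
For concreteness, a minimal NumPy sketch of this style of chroma key background replacement follows. The keyed color, tolerance value, and helper names are illustrative assumptions rather than the specific technique of the chroma key analyzer 122.

```python
# Illustrative sketch of chroma-key style background removal: pixels falling
# inside a keyed color range are treated as background and replaced with a
# solid replacement color. Thresholds are hypothetical.
import numpy as np

def chroma_key_replace(image, key_rgb, tolerance=40, replacement=(0, 0, 0)):
    """image: HxWx3 uint8 array; key_rgb: approximate background color."""
    diff = image.astype(np.int16) - np.array(key_rgb, dtype=np.int16)
    background_mask = (np.abs(diff) <= tolerance).all(axis=-1)
    out = image.copy()
    out[background_mask] = replacement       # "reveal" the replacement background
    return out, background_mask

def mask_bounds(background_mask):
    """Bounding extent of the non-background (check) pixels, giving candidate
    edge coordinates. Assumes the check occupies part of the frame."""
    ys, xs = np.nonzero(~background_mask)
    return xs.min(), ys.min(), xs.max(), ys.max()   # left, top, right, bottom
```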


Alternatively or additionally to the deterministic chroma key approach, the background image 250 may be separately distinguished from the check image 210 within the digital check image 201 based on a difference in image characteristics between the background image 250 and the check image 210 analyzed by the image comparator 123. After generating a modified digital check image that replaces the background image 250 with a predetermined replacement background image, the image comparator 123 may compare the modified digital check image with the predetermined replacement background image. By making this known image comparison, the image comparator 123 determines that the overlapping image portions correspond to a background portion, and the remaining image portions correspond to the check image 210. The image comparator 123 may then determine this difference and subtract the background portion from the modified digital check image to obtain the check image 210 alone, without background portions. Based on the check image 210 being separately extracted from the digital check image 201 according to the image comparison technique(s) applied by the image comparator 123, the image processing engine 120 may identify specific edge coordinates for the check image 210, such as the edge coordinates 301, 302, 303, 304 illustrated in FIG. 3. The modified digital check image 300 may be generated by the image processing engine 120 that includes the edge coordinates, where the modified digital check image 300 is included in the digital image file 135 transmitted to the institution system.


In addition, or alternatively, edge detection for the check image may be implemented by comparing the digital check image 201 to a predetermined rectangular shape that represents an outline of the check. This way, the check image 210 may be separately distinguished from the background image 250 within the digital check image 201 based on a matching of the predetermined rectangular shape with edge outlines of the check image 210 analyzed by the image comparator 123. After matching the predetermined rectangular shape to the edge outlines of the check image 210, the image comparator 123 may identify the edge coordinates of the check image 210. Based on the check image 210 being separately identified from the digital check image 201 according to the image comparison technique(s) applied by the image comparator 123, the image processing engine 120 may identify specific edge coordinates for the check image 210, such as the edge coordinates 301, 302, 303, 304 illustrated in FIG. 3. The modified digital check image 300 may be generated by the image processing engine 120 that includes the edge coordinates, where the modified digital check image 300 is included in the digital image file 135 transmitted to the institution system.


According to some embodiments, image processing, such as the edge detection, may be applied to segmented pieces of the digital check image 201. The grayscale converter 124 may convert the digital check image 201 generated by the image capture device 115 to grayscale using known techniques. In photography and computing, a grayscale digital image is an image in which the value of each pixel is a single sample, that is, it carries only intensity information. Images of this sort are composed exclusively of shades of gray, varying from black at the weakest intensity to white at the strongest. Conversion of a color image to grayscale is well known and any known technique(s) may be used.
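
As one example of such a known technique, the following sketch applies the standard ITU-R BT.601 luminosity weights (0.299, 0.587, 0.114) to convert an RGB image to a single-channel grayscale image.

```python
# Minimal sketch of the well-known luminosity conversion from RGB to
# grayscale, one example of the "known techniques" the grayscale
# converter 124 might apply.
import numpy as np

def to_grayscale(rgb):                       # rgb: HxWx3 uint8
    weights = np.array([0.299, 0.587, 0.114])
    return (rgb @ weights).astype(np.uint8)  # HxW intensity image
```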


The grayscale image generated by the grayscale converter 124 may be provided to the image segmenter 125, where the image segmenter 125 divides the image into a predetermined number of segments, such as 4 segments, 6 segments, 8 segments, or other predetermined number. Each segment may comprise a portion of the check image 210 (i.e., the check data) and a portion of the background image 250 (i.e., the background data) and an edge between the two portions. The segments may be equal in size or may differ in size. An example of a segmented image 400 divided into four segments is shown in FIG. 4.


The segment processor circuitry 126 may process each of the segments to identify and/or remove background data. In an implementation, histograms may be used to detect or identify background data and check data. The histogram of each segment can be used to determine an edge of the check image 210 from the background image 250 within the segment, so that the background image 250 may be removed or disregarded for subsequent processing.


A histogram is a well-known graph and may be used to display where all of the brightness levels contained in an image are found, from the darkest to the brightest. These values may be provided across the bottom of the graph from left (darkest) to right (brightest). The vertical axis (the height of points on the graph) shows how much of the image is found at any particular brightness level. An example of a histogram 500 for a segment of an image comprising check data and background data is further described with respect to FIG. 5. An edge between the check image 210 and the background image 250 may be identified to be those points on the digital check image corresponding to the threshold point (i.e., low point) reflected on the histogram 500 shown in FIG. 5.
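
A minimal sketch of this segment-and-histogram approach follows: the grayscale image is split into four segments, and each segment's threshold is taken as the low point between its two dominant brightness peaks, one typically corresponding to the background and the other to the check. The valley-finding heuristic and the 50-level minimum peak separation are illustrative assumptions.

```python
# Hedged sketch: split the grayscale image into quadrants, build each
# segment's brightness histogram, and take the valley between the two
# dominant peaks as the check/background threshold.
import numpy as np

def quadrants(gray):                         # gray: HxW uint8
    h, w = gray.shape
    return [gray[:h//2, :w//2], gray[:h//2, w//2:],
            gray[h//2:, :w//2], gray[h//2:, w//2:]]

def valley_threshold(segment):
    hist, _ = np.histogram(segment, bins=256, range=(0, 256))
    peak1 = int(np.argmax(hist))             # strongest brightness peak
    masked = hist.copy()                     # suppress a window around peak1
    masked[max(0, peak1 - 50):min(256, peak1 + 50)] = 0
    peak2 = int(np.argmax(masked))           # second dominant peak
    lo, hi = sorted((peak1, peak2))
    return lo + int(np.argmin(hist[lo:hi + 1]))   # low point between the peaks

for seg in quadrants(np.random.randint(0, 256, (480, 640), dtype=np.uint8)):
    print(valley_threshold(seg))
```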


The segment recombiner 127 recombines the processed segments to generate a recombined image comprising a depiction of the check 107. Based on the recombined image, the image processing engine 120 may identify specific edge coordinates for the check depicted in the recombined image, such as the edge coordinates 301, 302, 303, 304 illustrated in FIG. 3. The segment recombiner 127 may process the recombined image (including the edge coordinates) to be included with the digital image file 135 transmitted to the institution system 205 for deposit of the check 107 depicted in the modified digital check image.


In addition to, or instead of, the deterministic techniques for various image processing tasks, such as the edge detection and corner identification noted above, the image processing engine 120 may include a machine learning model repository 129 containing one or more trained machine learning models for each desired image processing task. Referring to FIG. 6, which illustrates one example of a machine learning model repository 129, several different trained machine learning models (ML models 1 through N) 602 may be included. Each of the ML models 602 in the model repository 129 may consist of a model parameter set 604 and a functional relationship component 606. As described below, the parameter set 604 contains the parameter data that was achieved by training the model 602, and the parameter set 604 is used to process the input data (e.g., check image data from the captured image depicting the check) according to the functional relationship component 606. The functional relationship component 606 may include the type of matrix manipulation or convolution algorithm to be applied to the model parameter set 604 and the input data, and the basic instructions for how the model parameters 604 are to be applied to the input data by the image processing circuitry 128. The model parameter set 604 is data that has been trained in a separate model training process prior to being made available for download to, and storage on, the computing device 109. For example, each ML model 602 may be trained at a central location and then downloaded as part of the remote deposit app from a third party app store or the institution.
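
The sketch below suggests one way such repository entries might be structured, pairing a parameter set (604) with a functional relationship component (606) that the image processor circuitry applies to an input image. All names and fields are illustrative assumptions, not structures taken from this disclosure.

```python
# Illustrative sketch of how entries in the model repository 129 might be
# structured in code. Names and fields are assumptions for illustration.
from dataclasses import dataclass
from typing import Any, Callable, Dict

@dataclass
class TrainedModel:
    name: str                          # e.g. "image_cropping", "demorphing"
    parameters: Any                    # trained weights (parameter set 604)
    apply: Callable[[Any, Any], Any]   # functional relationship component 606

class ModelRepository:
    def __init__(self):
        self._models: Dict[str, TrainedModel] = {}

    def register(self, model: TrainedModel):
        self._models[model.name] = model

    def run(self, name: str, image):
        model = self._models[name]
        # Executed by the image processor circuitry 128: apply the functional
        # relationship to the parameter set and the input image.
        return model.apply(model.parameters, image)
```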


In order to improve on standard algorithmic/deterministic techniques for identifying the check image 210 against the background and finding the corners or edges of the check, an ML model, such as an image cropping model 700 (see FIG. 7), may instead be implemented. In other words, rather than use one or more of the deterministic algorithm techniques discussed above for finding the corners and/or edges (e.g., via the chroma key, image comparator with background replacement, and document segmentation described above), an image cropping model having been separately trained using deep learning techniques, such as discussed with respect to FIG. 7, may be downloaded as part of a downloadable app or software executable by the computing device 109. The final trained model may include the parameter set 604 that results from the training process and the relationship components 606 for the image cropping model. The relationship components may include the functions to be applied to the model parameters and digital image file, such as convolutions and/or matrices for manipulating and applying the parameters to the digital image of the check captured by, and being remotely deposited by, the computing device 109.


Referring to FIG. 7, one embodiment of an image cropping model 700 may be trained using an efficient binarized convolutional network (XNOR-Net) to accurately identify bounding box coordinates for cropping the check in the digital check image. Each of a plurality of test check images, where the correct bounding box coordinates 702 of the check 210 inside the image 201 are already known, may be processed using the binarized convolutional network to “train” or converge on system parameters for the model that may then be applied to any random check to accurately identify the bounding box coordinates of that random check. The robustness of the image cropping model 700 may be strengthened by using a large number of training checks that represent a very large variety of conditions, including different shadows, colors, lighting, creases and fold lines, and so on. The binarized convolutional network technique trains this model by taking a feature and performing a convolution across the test check. A predetermined number of features are each processed via convolution to create a plurality of feature maps, one for each feature.


In one implementation of training this image cropping model 700, a feature may be a two-dimensional array of pixels (for example, a 3×3 array of 9 total pixels) with a particular pattern. The pattern may be, for example, a specific number and location of dark pixels in the 3×3 array. That feature is then convolved with different 3×3 windows 704 of pixels of the test check and the result is stored in the first level feature map 706 for that feature. Essentially, the 3×3 window 704 of a feature is shifted by one pixel and convolved with the test check pixels at that location until the window has been scrolled through the entire test check image. This process of convolution is repeated for each feature (each different 3×3 pixel pattern in this example) until each of the predetermined features has had a respective feature map 706 generated. Although the feature sets used in this step may include every possible permutation of the number and placement of dark pixels in a 3×3 array, the computational power and time necessary to handle all permutations may be unnecessarily large, and so only a fixed number of patterns, which may be predefined by the particular machine learning model, are typically convolved with the test check into respective feature maps.
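
The convolution step just described can be sketched directly: a 3×3 feature is shifted one pixel at a time across the image, and the response at each position fills that feature's map. The two example feature patterns below are arbitrary choices for illustration.

```python
# Minimal NumPy sketch of the described convolution: one feature map is
# produced per 3x3 feature by sliding the window across the test check.
import numpy as np

def feature_map(image, feature):             # image: HxW, feature: 3x3
    h, w = image.shape
    out = np.zeros((h - 2, w - 2))
    for y in range(h - 2):
        for x in range(w - 2):
            window = image[y:y + 3, x:x + 3]          # the shifted 3x3 window
            out[y, x] = np.sum(window * feature)      # convolution at (y, x)
    return out

features = [np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1]]),   # diagonal pattern
            np.array([[0, 1, 0], [0, 1, 0], [0, 1, 0]])]   # vertical pattern
image = np.random.randint(0, 2, (64, 128))   # stand-in binarized test check
maps = [feature_map(image, f) for f in features]   # one map per feature
```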


After creating the initial feature maps 706, the process of training the image cropping model illustrated in FIG. 7 next proceeds with a subsampling step 708 to achieve resolution reduction and more global feature mapping. The subsampling step 708 automatically reduces each region of the first feature map, for example taking each 50×50 portion of each feature map and reducing it to a 5×5 array. A new set of features for that 5×5 array are then put through convolutions resulting in a new set of feature maps 710 for each of those new features, and this process of subsampling and then processing new feature convolutions is repeated a predetermined number of times set for the model. The output of the image cropping model for the test check is a set of bounding box coordinates 702 that are then compared to the known bounding box coordinates for that test check. When the coordinates of the model in training do not match the actual known coordinates, the processor of the computer system training the model will adjust one or more features (e.g., swap out a feature originally used with a different feature at one or more of the different convolution stages), and reprocess the test check to see if the next result is closer to the known correct result for the bounding box coordinates. Which feature is selected, and the amount and magnitude of the changes the computer system training the model applies to that particular feature, may be determined using a variety of algorithms. In one implementation, a gradient descent algorithm may be used to make changes and iterate until the parameters of the model converge to most accurately match the known output of the test check. This process is then repeated for each of the test checks used to train the model 700. The final feature set 712 achieved at the end of the training of the model 700 is then fixed and forms the parameter set of a trained model that is stored with the functional relationship information on how to apply the model parameters to future check image information.
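
A compact PyTorch sketch of this style of training loop is shown below, with convolution and subsampling stages feeding a regression of the eight corner coordinates, and gradient descent adjusting the parameters toward the known coordinates of each test check. Note that this sketch uses ordinary (non-binarized) convolutions rather than XNOR-Net's binarized operations, and the architecture and hyperparameters are illustrative assumptions, not the trained model described here.

```python
# Hedged sketch: corner-coordinate regression trained by gradient descent
# against known bounding box coordinates. Architecture is illustrative.
import torch
import torch.nn as nn

class CornerNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 8, 3), nn.ReLU(), nn.MaxPool2d(2),   # convolve + subsample
            nn.Conv2d(8, 16, 3), nn.ReLU(), nn.MaxPool2d(2),  # repeat at lower resolution
            nn.Flatten(),
            nn.Linear(16 * 13 * 28, 8),   # for 60x120 inputs; 4 corners x (x, y)
        )

    def forward(self, x):
        return self.net(x)

model = CornerNet()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)   # gradient descent
images = torch.rand(16, 1, 60, 120)          # stand-in test check images
known_corners = torch.rand(16, 8)            # known bounding box coordinates

for epoch in range(10):
    predicted = model(images)
    loss = nn.functional.mse_loss(predicted, known_corners)  # distance from known result
    optimizer.zero_grad()
    loss.backward()                          # adjust parameters toward convergence
    optimizer.step()
```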


Similarly, rather than use a discrete, deterministic algorithm to de-skew or de-warp check images, the model repository 129 of the image processing engine 120 may also include a trained demorphing transformation machine learning model 800. Referring to FIG. 8, the demorphing model 800 may be trained by providing training data consisting of images of checks 802 with skew or warp issues and known de-skewed, de-warped results. Any of a number of base algorithms for the training of this particular deep learning model may be implemented. As one example of a suitable candidate base algorithm for use in training the demorphing model 800, a spatial transformer network algorithm, such as described in Max Jaderberg et al., “Spatial Transformer Networks,” June 2015 (available from https://arxiv.org/abs/1506.02025v1), may be used. As with the edge and corner detection of the image cropping model described above, the demorphing model may be trained with many different examples of training data inputs: differing warp or skew inputs combined with different lighting, noise, or other features. The resulting output may be a spatial transformation matrix 804 to apply to the digital image of the check to achieve the final image, or even a final de-skewed/de-warped image 806 in more sophisticated trained models.
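
As an illustration of applying such a spatial transformation to de-warp a check, the OpenCV sketch below builds a 3×3 perspective matrix from four detected corners and warps the image to a flat rectangle. In the trained demorphing model 800, the matrix would instead come from the network's output; the output dimensions here are arbitrary.

```python
# Illustrative sketch of applying a spatial transformation matrix (804) to
# de-warp a check image. The hand-built homography stands in for the matrix
# a trained demorphing network would produce.
import cv2
import numpy as np

def dewarp(image, corners, out_w=1200, out_h=550):
    """corners: 4x2 array of the check's detected corners, ordered
    top-left, top-right, bottom-right, bottom-left."""
    dst = np.float32([[0, 0], [out_w, 0], [out_w, out_h], [0, out_h]])
    M = cv2.getPerspectiveTransform(np.float32(corners), dst)   # 3x3 matrix
    return cv2.warpPerspective(image, M, (out_w, out_h))
```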


As illustrated in FIG. 9, another trained model that may be included in the model repository 129 of the image processing engine 120 is a denoising and binarization model 900. In order to recognize printed material on a check, an optical character recognition (OCR) process is typically applied during check processing. A challenge in the remote deposit of checks is the fact that, in many instances, there exists information that has been handwritten on the check that may obscure important printed information. For example, a signature or a memo line entry on a check may overlap the MICR information (which contains routing or account number information). This may, in turn, cause errors or general failure of an OCR process and thus cause a remote deposit attempt to fail. Other types of image noise that may cause an OCR process to fail include reflections or shadows across the text, or a crease or wrinkle crossing printed text. Instead of using typical deterministic algorithms, a machine learning model may be trained to identify and remove this background noise.


As illustrated in FIG. 9, an example 902 of handwriting “noise” obscuring MICR text is shown, along with the desired output 904 of a denoising process prior to OCR being applied. Deterministic algorithms may attempt to identify and remove specific types of noise; however, a deep learning model may be trained and disseminated that can handle a wider variety of input noise variations in a faster and more effective manner.


As shown in FIG. 9, a server or other computing device may train a suitable denoising model 900 by using training data and processing that training data through one or more of convolutional auto-encoders, cycle-consistent adversarial networks and capsule networks 906. The training checks used for this particular machine learning model 900 may include checks with different parts of the printed data obscured and the known correct printed data available to adjust the parameters during the training.
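
A minimal PyTorch sketch of a convolutional auto-encoder of the kind named above follows; during training it would be fed pairs of obscured and clean text crops so the decoder learns to reconstruct the clean printed data. The layer sizes and mean-squared-error loss are illustrative assumptions.

```python
# Hedged sketch of a convolutional auto-encoder for denoising text crops.
# Layer sizes are illustrative, not those of the model 900.
import torch
import torch.nn as nn

class DenoisingAutoencoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(32, 16, 3, stride=2, padding=1, output_padding=1),
            nn.ReLU(),
            nn.ConvTranspose2d(16, 1, 3, stride=2, padding=1, output_padding=1),
            nn.Sigmoid(),                    # clean, binarized-looking output
        )

    def forward(self, noisy):
        return self.decoder(self.encoder(noisy))

model = DenoisingAutoencoder()
noisy = torch.rand(4, 1, 64, 256)            # e.g. crops of an obscured MICR line
clean = torch.rand(4, 1, 64, 256)            # known correct printed data
loss = nn.functional.mse_loss(model(noisy), clean)
loss.backward()
```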


Yet another trained model that may be included in the model repository 129 of the image processing engine 120 is a duplicate check detection model 1000. Unlike the image cropping model 700, the demorphing model 800, and the denoising and binarization model 900, the duplicate check detection model 1000 is trained to determine if a currently processed check is similar enough to a previously deposited check to be flagged as a duplicate. As illustrated in FIG. 10, a new check image 1002 processed in the computing device 109 is essentially compared to prior images of previously deposited checks 1004 and, if the new check image is very close to a prior deposited check image, then the remote deposit process may be stopped, or the transaction flagged 1006 for closer inspection. Unlike a deterministic algorithm, which might simply compare MICR numbers of prior checks and the new check, the duplicate check detection model 1000 may be trained to analyze a large number of feature similarities, over a large number of different lighting or other differences, in addition to being able to recognize MICR information.


As shown in FIG. 11, the duplicate check detection model 1000 may be trained using deep auto encoders. The encoding process may be a perceptual hash (or pHash) technique 1100 that transforms an original check image 1102 into a hashed representation, allowing a family of slight variations in the appearance of a check (e.g., different lighting, folds, tears, etc.) to be clustered in a node representing the variations of a learnt check 1104, which allows for a more robust comparison of an input check to prior check information. Application of the duplicate check detection model 1000 will allow the remote server receiving the check image from the mobile computing device to cease check processing and send a rejection message to the mobile computing device for display, or to flag the transaction for later review, when the new check image 1002 is determined to be a duplicate of a previously deposited check image 1004. If implemented on the mobile computing device, rather than on the institution server or other central computing device, the mobile computing device may alert the user to a potential problem, automatically prevent transmission of the image or check information from the mobile device, or send an alert to the institution or central server to cease processing, when the trained model indicates that the new check image is associated with a duplicate check.
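
The sketch below shows a conventional DCT-based perceptual hash and a Hamming-distance duplicate test in the spirit of the encoding described above. The 64-bit hash size and the distance threshold are common pHash conventions assumed for illustration, not parameters from this disclosure.

```python
# Illustrative sketch of a perceptual hash (pHash) of a check image and a
# Hamming-distance duplicate test. Hash size and threshold are assumptions.
import cv2
import numpy as np
from scipy.fftpack import dct

def phash(gray):                               # gray: single-channel uint8
    small = cv2.resize(gray, (32, 32)).astype(np.float64)
    freq = dct(dct(small, axis=0, norm='ortho'), axis=1, norm='ortho')
    low = freq[:8, :8]                         # low-frequency 8x8 block
    return (low > np.median(low)).flatten()    # 64-bit signature

def looks_duplicate(hash_a, hash_b, max_distance=10):
    hamming = int(np.count_nonzero(hash_a != hash_b))
    return hamming <= max_distance             # tolerant of lighting/shadow shifts
```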


As noted earlier with respect to FIG. 1, in an implementation, some or all of the image processing described as being implemented by one or more components of the image processing engine 120 may be performed by the institution system 205 after the institution system 205 receives the digital image file 135 from the computing device 109.


The image processor circuitry 128 of the computing device 109, alone or in combination with other components comprising the image processing engine 120, may operate to implement one or more image processing features, such as to remove warping from (de-warp) a digital image, to crop a digital image, to de-skew a digital image (e.g., rotate the image to horizontal or vertical alignments), to identify the corners/edges within a digital image, to adjust brightness, tint, contrast, or other image attributes of a digital image, or to perform any other known digital image processing capability. In one implementation, the image processor circuitry 128 may include a single processing element, or multiple separate processing elements, such as embedded processors including specialized graphics processors known as graphics processing units (GPUs) available from NVIDIA Corporation of Santa Clara, Calif.


The user 105 may thus generate a digital check image of the check 107 using the image capture device 115 on the computing device 109. For example, after endorsing the check 107, the user 105 may operate the image capture device 115 to capture a digital image of the front side and/or back side of the check 107 and store the digital image(s) as a digital image file on a storage memory of the computing device 109. The images of the front side and the back side of the check 107 may be processed using the techniques described herein. The images may be processed as separate files or as images in a single file. The images of the front side and the back side of the check 107 may be captured sequentially, e.g., pursuant to the user 105 flipping the check 107 over after an image of the front of the check 107 has been captured.


The digital image file 135 comprising the digital image depicting the check 107 may be transmitted to the institution system 205. The digital image may be the original digital image captured by the image capture device 115, or a modified digital image having one or more image processing techniques applied. For example, the modified digital image may include the edge coordinates 301, 302, 303, 304 described herein. The computing device 109 transmits the digital image file 135, which may include secondary data, to the institution system 205 along with a request to deposit the check 107 into an account, such as the account 165. In an implementation, the user 105 may attach the digital image file 135 to an email, short message service (SMS) text, or other form of electronic message, and send the digital image file 135 to the institution system 205 using the computing device 109. However, other techniques for sending a digital image file 135 to the institution system 205 may be used, such as providing a digital image file 135 from storage to the website 218 associated with the institution system 205.


The institution 200 in conjunction with the institution system 205 may process the deposit request according to the digital image and any secondary data included in the digital image file 135. Thus, the institution 200 in conjunction with the institution system 205 may process the digital image file 135 comprising the digital image depicting the check 107 for deposit.


In an implementation, the institution system 205 may extract the check image 210 from the digital image file 135 and process the check 107 from the digital image for deposit, according to any one or more of the image processing techniques described herein. The institution system 205 may use only machine learning models, such as those described in FIGS. 7-11, to process the check. The institution system may alternatively use a mix of deterministic algorithms and machine learning models, may selectively use deterministic algorithms or machine learning models that differ from those used at the computing device, or may utilize a mixture of machine learning models for some processing functions and deterministic algorithms for other functions. Any image processing technology, software, or other application(s) may be used to retrieve the digital image of the check 107 from the digital image file 135 and to obtain the relevant data of the check 107 from the digital image file 135. The institution system 205 may determine whether the financial information associated with the check 107 is valid. Also, because of the dynamic nature of prior check deposit information, in one embodiment only the institution system may apply the machine learning model for duplicate check detection.


Upon receipt and processing of the digital image file 135 and approval of the check 107 associated therewith, the institution 200 may credit the funds of the check 107 to the account 165. It will be appreciated that the examples herein are for purposes of illustration and explanation only, and that an embodiment is not limited to such examples.


In an implementation, the computing device 109 may be a mobile device that comprises a digital camera which can capture a digital image of the check 107 by taking a picture of the front and/or back of the check 107. The back of the check may provide endorsement verification, such as the signature of the person or party the check is made out to. The user 105 may send the digital image file 135, including the check image, to the institution system 205 using the mobile device. An exemplary computing device 109 is described with respect to FIG. 12. It is contemplated that any device that is capable of generating a digital image may be used to make a digital image of the check 107 which may be processed for sending to the institution system 205 as a digital image file 135. Additional devices that may be used in the generation and/or transmission of a digital image include a digital camera, a photocopier, a fax machine, and the like, for example.


The institution system 205 may include any combination of systems and subsystems such as electronic devices including, but not limited to, computers, servers, databases, or the like. The electronic devices may include any combination of hardware components such as processors, databases, storage drives, registers, cache, random access memory (RAM) chips, data buses, or the like and/or software components such as operating systems, database management applications, or the like. According to an embodiment, the electronic devices may include a network-based server that may process the financial information and may receive the digital image file 135 from the computing device 109.


The electronic devices may receive the digital image file 135 and may perform an initial analysis on the quality of the image of the check 107 in the digital image file 135, the readability of the data contained therein, or the like. For example, the electronic devices may determine whether the amount payable and other information may be readable such that it may be obtained and processed by the institution system 205 to credit the account 165 associated with the user 105.


The institution system 205 may include user interface circuitry 220, an image processor 222, and a data source access engine 227. The user interface circuitry 220 may generate and format one or more pages of content 219 as a unified graphical presentation that may be provided to the computing device 109 or a representative computing device 187. In an implementation, the page(s) of content 219 may be provided to the computing device 109 and/or the representative computing device 187 via a secure website 218 associated with the institution system 205.


In an implementation, the institution system 205 may use the image processor 222 to process the digital image file 135 comprising the image(s) 137 of the check 107 received from the user 105 for use by the institution 200 in the processing and/or clearance of the check 107. The image processor 222 may process multiple frames of the image if the image is comprised of multiple frames (e.g., the front side and the back side of a check).



FIG. 12 illustrates an exemplary computer architecture 1200 for a computing device such as any one of the computing device 109, institution system 205, or another computing device. The computer architecture 1200 includes system circuitry 1202, display circuitry 1204, input/output (I/O) interface circuitry 1206, and communication interfaces 1208. The graphical user interfaces (GUIs) 1205 displayed by the display circuitry 1204 may be representative of GUIs generated by the application for remote deposit. The GUIs may be displayed locally using the display circuitry 1204, or for remote visualization, e.g., as HTML, JavaScript, audio, and video output for a web browser running on a local or remote machine such as the computing device 109 or institution system 205.


The GUIs 1205 and the I/O interface circuitry 1206 may include touch sensitive displays, voice or facial recognition inputs, buttons, switches, speakers and other user interface elements. Additional examples of the I/O interface circuitry 1206 includes microphones, video and still image cameras, headset and microphone input/output jacks, Universal Serial Bus (USB) connectors, memory card slots, and other types of inputs. The I/O interface circuitry 1206 may further include magnetic or optical media interfaces (e.g., a CDROM or DVD drive), serial and parallel bus interfaces, and keyboard and mouse interfaces.


The communication interfaces 1208 may include wireless transmitters and receivers (“transceivers”) 1210 and any antennas 1212 used by the circuitry of the transceivers 1210. The transceivers 1210 and antennas 1212 may support Wi-Fi network communications, for instance, under any version of IEEE 802.11, e.g., 802.11n or 802.11ac, or other wireless protocols such as Bluetooth, WLAN, or cellular (4G, LTE/A). The communication interfaces 1208 may also include serial interfaces, such as universal serial bus (USB), serial ATA, IEEE 1394, Lightning port, I2C, SLIMbus, or other serial interfaces. The communication interfaces 1208 may also include wireline transceivers 1214 to support wired communication protocols. The wireline transceivers 1214 may provide physical layer interfaces for any of a wide range of communication protocols, such as any type of Ethernet, Gigabit Ethernet, optical networking protocols, data over cable service interface specification (DOCSIS), digital subscriber line (DSL), Synchronous Optical Network (SONET), or other protocol. The communication interfaces 1208 may communicate with remote computing devices via a network, such as the communications network 140.


The system circuitry 1202 may be representative of any combination of hardware, software, firmware, application programming interface, or other circuitry for implementing the features of the remote deposit application described herein. For example, the system circuitry 1202 may be implemented with one or more systems on a chip (SoC), application specific integrated circuits (ASIC), microprocessors, discrete analog and digital circuits, and other circuitry. The system circuitry 1202 may implement any of the image processing features described herein. As an example, the system circuitry 1202 may include one or more processors 1216 and memory 1220.


The memory 1220 stores, for example, control instructions 1223 for executing the features of the remote deposit application running on the computing device 109 and/or institution system 205, as well as an operating system 1221. In one implementation, the processors 1216 execute the control instructions 1223 and the operating system 1221 to carry out any of the features for the remote deposit application. For example, the control instructions 1223 for the remote deposit application include instructions that, when executed by the processors 1216, implement the features corresponding to the image processing engine 120, which may include the deterministic image processing algorithms and/or the ML models discussed previously. The memory 1220 also includes control parameters 1222 that provide and specify configuration and operating options for the control instructions 1223 (such as the image processing technique selection instructions discussed herein), the operating system 1221, and other functionality of the computer architecture 1200.



FIG. 13 shows a flow diagram 1300 describing one implementation of image processing features by the image processing engine 120 as part of a remote deposit solution. The image processing features described by the flow diagram 1300 include those relating to edge/corner detection for determining coordinates of the edges of a check depicted in a digital image.


At 1301, an account owner (e.g., the payee of an amount written on a check, referred to herein as a user) may receive a check from a third party (e.g., the payor of the check), and may endorse the check by signing the back of the check in the designated field. If the user wishes to deposit the check into an account, such as a savings and/or checking account, the user may also write an account number below the signature on the back of the check.


At 1302, the user may operate the computing device 109 to open a communication pathway with the institution system 205 associated with an account of the user for depositing funds. The communication pathway may be implemented by logging into a website operating within the institution system 205, for example. Additionally or alternatively, a remote deposit application may be downloaded onto the computing device 109, and executing the remote deposit application on the computing device 109 may initiate opening of the communication pathway. The communication pathway may be established after the user authenticates access to the institution system 205 using user-specific authentication credentials such as, but not limited to, a username and a password, or biometric information such as fingerprints, voice recognition, or facial recognition information.


At 1303, a digital check image of the check may be captured by the image capture device 115. The user may manually operate the image capture device 115 to capture the digital check image to include the check and a background portion. According to some embodiments, the remote deposit application running on the computing device may automatically capture the digital check image when a set of predetermined criteria is determined to be satisfied. For example, the remote deposit application may control the image capture device 115 to capture the digital check image after detecting that the check is aligned within an alignment guide displayed on a display screen of the computing device 109, where the display screen displays a field of view of the image capture device 115. As another example, the remote deposit application may control the image capture device 115 to capture the digital check image after detecting that the check is in focus for the digital image capture by the image capture device 115. As yet another example, the remote deposit application may control the image capture device 115 to capture the digital check image after detecting that the check occupies at least a predetermined portion of the digital check image to be captured by the image capture device 115.
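By way of illustration, the following is a minimal sketch of such capture-trigger logic, assuming hypothetical thresholds, a variance-of-Laplacian focus proxy, and that the bounding boxes for the check and the alignment guide come from the app's live preview; this is not the claimed implementation.

    # Minimal sketch of the automatic-capture criteria described above.
    # All helper names and threshold values are hypothetical.
    import cv2
    import numpy as np

    FOCUS_THRESHOLD = 100.0   # assumed variance-of-Laplacian focus cutoff
    MIN_AREA_FRACTION = 0.5   # check must fill at least half the frame

    def measure_focus(gray: np.ndarray) -> float:
        # Variance of the Laplacian is a common sharpness proxy.
        return cv2.Laplacian(gray, cv2.CV_64F).var()

    def should_auto_capture(frame: np.ndarray,
                            check_box: tuple[int, int, int, int],
                            guide_box: tuple[int, int, int, int]) -> bool:
        """Return True when all predetermined capture criteria are met."""
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        x, y, w, h = check_box
        gx, gy, gw, gh = guide_box
        aligned = (gx <= x and gy <= y and
                   x + w <= gx + gw and y + h <= gy + gh)
        in_focus = measure_focus(gray) >= FOCUS_THRESHOLD
        frame_area = frame.shape[0] * frame.shape[1]
        big_enough = (w * h) / frame_area >= MIN_AREA_FRACTION
        return aligned and in_focus and big_enough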


Also at 1303, a deposit request is initiated. The deposit request may include selecting an account in which to deposit the check. In an implementation, the user may select a "deposit check" option provided on a graphical user interface (GUI) displayed on the computing device 109. The user may also input one or more of the following items of check information through the GUI: check amount, date, the account into which the check funds should be deposited, comments, routing number, or other check data.


At 1304, edge detection and corner identification are executed by the image processing engine 120 according to any one or more of the edge detection processes described herein. In one implementation, the cropping machine learning model 700 is executed by processing the captured digital image using the trained model. The edge detection process includes distinguishing the check image from the background image that, together, comprise the overall digital check image captured by the image capture device 115. By executing the edge detection on the digital check image, edge coordinates for the check image are determined. For example, the edge detection may determine the edge coordinates 702 as shown in FIG. 7.
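As a hedged illustration of how a trained cropping model might be invoked on-device, consider the sketch below; the ONNX runtime, the model file name, the input resolution, and the normalized-corner output format are all assumptions, not the model of FIG. 7.

    # Hedged sketch: running a trained cropping model to obtain edge
    # coordinates. Model file and I/O shapes are assumptions.
    import cv2
    import numpy as np
    import onnxruntime as ort

    session = ort.InferenceSession("cropping_model.onnx")  # hypothetical file

    def detect_check_edges(image_bgr: np.ndarray) -> np.ndarray:
        # Assume the model takes a 256x256 RGB float tensor and returns
        # four (x, y) corner coordinates normalized to [0, 1].
        resized = cv2.resize(image_bgr, (256, 256))
        rgb = cv2.cvtColor(resized, cv2.COLOR_BGR2RGB).astype(np.float32) / 255.0
        tensor = np.transpose(rgb, (2, 0, 1))[np.newaxis, ...]  # NCHW layout
        (corners,) = session.run(None, {"image": tensor})  # assumes one output
        h, w = image_bgr.shape[:2]
        # Scale normalized corners back to original pixel coordinates.
        return corners.reshape(4, 2) * np.array([w, h], dtype=np.float32)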


According to some other embodiments, deterministic algorithms may be used instead of applying the machine learning model. In those embodiments, the edge detection implemented at 1304 may include one or more of the following processes: scaling down the digital check image for faster processing (the scale factor depends on the image size), converting the digital check image to grayscale, using close and/or open morphs on the digital check image to eliminate noise, binarizing the digital check image using a dynamic threshold algorithm (e.g., the Otsu algorithm), and running the result through multiple cropping logic algorithms, e.g., using OpenCV, until a "check" is found within the digital check image (e.g., 4 predetermined algorithms may be applied). To determine if a crop was successful, the crop result may be run through a crop validator that checks aspect ratio, area, skew, zoom, and/or other image analysis for validating a successful crop. The resulting digital check image may be run through focus detection to determine if the digital check image is blurry, and a refocusing process may be applied if blurriness is detected within a predetermined range. After multiple "successful" digital check images have been processed, the final digital check image is ready (e.g., considered a modified digital check image), along with four corner check coordinates and/or an image size of the modified digital check image.
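A minimal sketch of this deterministic pipeline, using OpenCV in Python, is shown below; the scale rule, morphology kernel, and validator ranges are illustrative assumptions rather than the four predetermined algorithms referenced above.

    # Sketch of the deterministic edge-detection pipeline described above.
    import cv2
    import numpy as np

    def deterministic_crop(image_bgr: np.ndarray):
        scale = 0.5 if max(image_bgr.shape[:2]) > 1000 else 1.0  # size-dependent
        small = cv2.resize(image_bgr, None, fx=scale, fy=scale)
        gray = cv2.cvtColor(small, cv2.COLOR_BGR2GRAY)
        # Close/open morphs to suppress background noise.
        kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (5, 5))
        cleaned = cv2.morphologyEx(gray, cv2.MORPH_CLOSE, kernel)
        cleaned = cv2.morphologyEx(cleaned, cv2.MORPH_OPEN, kernel)
        # Binarize with Otsu's dynamic threshold.
        _, binary = cv2.threshold(cleaned, 0, 255,
                                  cv2.THRESH_BINARY + cv2.THRESH_OTSU)
        contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return None
        box = cv2.boundingRect(max(contours, key=cv2.contourArea))
        return box if validate_crop(box, small.shape) else None

    def validate_crop(box, shape) -> bool:
        # Crop validator: aspect ratio and area checks (ranges assumed).
        x, y, w, h = box
        aspect_ok = 1.8 <= w / max(h, 1) <= 3.0   # typical check proportions
        area_ok = (w * h) / (shape[0] * shape[1]) >= 0.3
        return aspect_ok and area_ok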


At 1305, a modified digital check image (e.g., modified digital check image 300) is generated that includes the edge coordinates (e.g., bounding box coordinates 702 from the cropping model as shown in FIG. 7 or edge coordinates 301, 302, 303, 304 from a deterministic model as shown in FIG. 3).


At 1306, the modified digital check image may undergo one or more further image processing steps. The image processing steps may include any one or more of the image processing functions attributed to the image processor 128, as described herein. These one or more additional image processing steps may include only processing the modified image via machine learning models such as those described above, only processing the image using techniques based on deterministic algorithms, or a mixture of both.


At 1307, the resulting processed image is stored within a digital image file. According to some embodiments, the digital image file may also include supplemental information, as described herein. At 1308, the digital image file is transmitted to the institution system 205. After receipt, the institution system 205 may further process the digital image file and deposit the funds corresponding to the check depicted in the digital image file. The digital image file may include the digital check image in a known digital image format (e.g., a JPEG digital image file). According to some embodiments, the digital image file may further include edge coordinate information for edges detected within the digital check image by the image processing engine 120 at the computing device 109. The edge coordinate information may be in the following exemplary format (exemplary coordinates are provided):


{
  "frontImageCoordinates": [
    186.0,
    282.0,
    1368.0,
    294.0,
    1374.0,
    828.0,
    174.0,
    798.0
  ],
  "frontImageWidth": "1440",
  "backImageWidth": "1500",
  "frontImageHeight": "1080",
  "backImageHeight": "1000"
}
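For illustration only, a short sketch of packaging edge coordinates into the exemplary format above follows; the function name and the client-side serialization are assumptions, while the field names and sample values mirror the example.

    # Sketch: packaging edge coordinates in the exemplary format above.
    import json

    def build_coordinate_payload(front_corners, front_size, back_size) -> str:
        """front_corners: four (x, y) corner points; sizes: (width, height)."""
        flat = [float(v) for point in front_corners for v in point]
        payload = {
            "frontImageCoordinates": flat,
            "frontImageWidth": str(front_size[0]),
            "backImageWidth": str(back_size[0]),
            "frontImageHeight": str(front_size[1]),
            "backImageHeight": str(back_size[1]),
        }
        return json.dumps(payload, indent=2)

    # Example mirroring the coordinates shown above:
    print(build_coordinate_payload(
        [(186.0, 282.0), (1368.0, 294.0), (1374.0, 828.0), (174.0, 798.0)],
        front_size=(1440, 1080), back_size=(1500, 1000)))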


Further description of the processes that may be implemented on the digital image file by the institution system is provided with reference to the flow diagram 1400 shown in FIG. 14. According to the flow diagram 1400, the institution system 205 receives the digital image file from the computing device 109 at 1401.


At 1402, edge detection is executed by the institution system 205 according to any one or more of the edge detection processes described herein. The edge detection process distinguishes the check image from the background image that, together, comprise the overall digital check image captured by the image capture device 115. By executing the edge detection on the digital check image, edge coordinates for the check image are determined. For example, the edge detection may determine bounding box coordinates 702 from the trained machine learning cropping model as shown in FIG. 7 or edge coordinates 301, 302, 303, 304 from a deterministic model as shown in FIG. 3.


According to some embodiments, the edge detection executed by the institution system 205 may be different from the edge detection executed by the image processing engine 120 of the computing device 109. In one implementation, the edge detection at the institution system may be a machine learning model such as discussed above, while the edge detection at the computing device may be a deterministic algorithm version. In other implementations, the computing device and the institution system may both use different trained machine learning models, or may each use different deterministic algorithm edge detection techniques.


For example, with respect to the deterministic techniques of edge detection, the edge detection at 1402 at the institution system 205 may include one or more of the following: scaling the digital check image by a predetermined amount (e.g., scaling down the digital check image by a half (½)), converting the digital check image to a grayscale image, using close and/or open morphs on the digital check image to eliminate noise, binarizing the digital check image using a dynamic threshold algorithm (e.g., Otsu's method), and/or applying a Hough transform algorithm to identify lines in the digital check image. The edge detection at 1402 may further include iterating through all the identified lines from the digital check image and discarding any lines that do not correspond to a check edge, where a check edge may be determined according to one or more of the following rules: vertical lines must be angled between 85 and 95 degrees or between 265 and 275 degrees; horizontal lines must be angled between 175 and 195 degrees or between 355 and 360 degrees; discard lines that lie completely in the top fourth or bottom fourth of the image, as these lines have a high likelihood of being image noise; discard lines that lie completely in the right fourth or left fourth of the image, as these lines have a high likelihood of being image noise; and/or find the average distance between every line and the center of the digital check image and discard any outlier lines that are more than a predetermined distance from the center of the digital check image. The edge detection at 1402 may further include creating a bounding box around the lines remaining after the line removal process, as those lines have a high likelihood of actually corresponding to the check depicted in the digital check image. If creating the bounding box around the remaining lines does not produce a Check 21-valid image, then another attempt to create a bounding box around contours of the digital check image may be implemented. The contours may be detected according to a contour detection algorithm such as the Suzuki85 algorithm.
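The line-filtering rules above can be sketched as follows; the Hough parameters are assumed, the angle and position tolerances follow the text, and a near-0° band is folded in for symmetry with the stated 355-360° rule.

    # Sketch of the server-side Hough-based line filtering described above.
    import cv2
    import numpy as np

    def find_check_lines(binary: np.ndarray):
        h, w = binary.shape[:2]
        # Probabilistic Hough transform; vote threshold and gaps are assumed.
        lines = cv2.HoughLinesP(binary, 1, np.pi / 180, threshold=80,
                                minLineLength=w // 4, maxLineGap=10)
        kept = []
        for x1, y1, x2, y2 in (lines.reshape(-1, 4) if lines is not None else []):
            angle = np.degrees(np.arctan2(y2 - y1, x2 - x1)) % 360
            vertical = 85 <= angle <= 95 or 265 <= angle <= 275
            horizontal = (175 <= angle <= 195 or angle >= 355
                          or angle <= 5)  # near-0 band added for symmetry
            if not (vertical or horizontal):
                continue
            # Discard lines entirely within the outer fourths of the image,
            # which have a high likelihood of being noise.
            if max(y1, y2) < h / 4 or min(y1, y2) > 3 * h / 4:
                continue
            if max(x1, x2) < w / 4 or min(x1, x2) > 3 * w / 4:
                continue
            kept.append((int(x1), int(y1), int(x2), int(y2)))
        # An outlier-distance filter and a bounding box fit would follow here.
        return kept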


At 1403, conditions detected during the edge detection implemented by the image processing engine 120 may be compared to conditions detected during the edge detection implemented by the institution system 205. The conditions may include any environmental conditions (e.g., brightness levels) that may have affected the quality of the digital check image captured by the image capture device 115. Additionally or alternatively, as the edge detection implemented by the institution system 205 may be different from the edge detection implemented by the image processing engine 120 of the computing device 109, an accuracy of the edge detection implemented by the image processing engine 120 may be compared to an accuracy of the edge detection implemented by the institution system 205.


At 1404, the institution system 205 selects between the edge coordinates included in the digital image file received from the computing device 109 and the edge coordinates determined by the institution system 205. The institution system 205 may select the edge coordinates based on predetermined criteria, for example, criteria related to the detected conditions or to the determined accuracy of the edge coordinates. When the selection is based on the detected conditions, the institution system 205 may select the edge coordinates determined by the image processing engine 120 on the computing device 109 when the conditions indicate the edge detection processes implemented by the image processing engine 120 would result in a greater accuracy than the edge detection processes implemented by the institution system 205. When the selection is based on the accuracy of the edge detection, the institution system 205 may select the edge coordinates determined to be more accurate.
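A trivial sketch of this selection step is below; the accuracy estimates are assumed to be produced upstream (e.g., from detected conditions or validator scores), and the tie-break toward the device result is an illustrative choice.

    # Sketch: choose between device-side and server-side edge coordinates.
    def select_edge_coordinates(device_coords, server_coords,
                                device_accuracy: float,
                                server_accuracy: float):
        """Return the coordinate set with the higher estimated accuracy."""
        if device_coords is None:
            return server_coords
        if server_coords is None:
            return device_coords
        # Ties favor the device result (illustrative choice).
        return device_coords if device_accuracy >= server_accuracy else server_coords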


At 1405, the institution system 205 identifies the check image from the digital image file based on the selected edge coordinates. From the identified check image, the check data contained within the check image may be extracted.


At 1406, the institution system 205 processes the check depicted in the check image to extract the check data contained therein. Processing of the digital image file may include retrieving financial information regarding the check. The financial information may comprise the MICR number, the routing number, a check amount, or other information found on the check. The processing step may include implementation of one or more ML models, such as the dewarping model 800 or the denoising model 900 discussed above, or may implement deterministic techniques using known algorithms, before extracting financial information from the image.


At 1407, after retrieving the financial information from the check in an electronic data representation form, the institution system 205 determines whether the financial information is valid. For example, the institution system 205 may perform an initial analysis on the quality of the data representation, the readability of the data representation, or the like. For example, the institution system 205 may determine whether the account number, amount payable, or the like is readable such that it may be parsed and processed by the institution to credit an account associated with the user. If the financial information is determined to be valid, the electronic data representation may be processed by the institution system 205, thereby depositing the money in the user's account 165. If the financial information is determined to be invalid, then the user may be advised. For example, the institution system 205 may transmit an email, a web message, an instant message, or the like to the user indicating that the financial information associated with the electronic data representation may be invalid. The user may determine how to proceed by selecting an option on the web message, replying to the email, or the like.
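The initial validity screen might look like the sketch below, which assumes the extracted fields arrive as strings in a dict; the 9-digit routing rule reflects US routing numbers, while the account-length range and the field names are assumptions.

    # Sketch of the validity screening described above; rules are assumed.
    import re

    def validate_financial_info(info: dict) -> bool:
        """Basic readability/parsability checks on extracted check data."""
        routing = info.get("routing_number", "")
        account = info.get("account_number", "")
        amount = info.get("amount", "")
        if not re.fullmatch(r"\d{9}", routing):     # US routing numbers: 9 digits
            return False
        if not re.fullmatch(r"\d{4,17}", account):  # typical account lengths
            return False
        try:
            return float(amount) > 0
        except ValueError:
            return False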


The remote deposit software application, or app, that a user computing device 109 downloads from a third party or institution may be customized by the third party or institution. In one embodiment, the ability of a user-operated computing device 109, such as a smartphone, tablet, or other computing device, to handle certain machine learning models for image processing or duplicate check detection may be determined when the app is requested, such that the computing device is only provided those features or types of image processing software that the computing device can properly execute. For example, the hardware and operating system information for the computing device 109 may be provided to the third party or institution system 205 and the computing device 109 capabilities assessed by the third party app provider or institution. Referring to FIG. 15, when the third party app provider or institution is queried by the computing device for a remote deposit app, or for an update to an existing remote deposit app, the information on the computing device capabilities may be looked up in a database at the third party or institution (at 1502). Alternatively, the query may include the processor of the computing device 109 automatically generating the computing device capabilities in a message that is provided to, or requested by, the third party or institution system 205. The data may include model information, processor information, operating system, memory availability, and so on.


Upon receipt of the information, the institution system 205 may determine if the computing device of the user is capable of handling the processing and memory requirements for each machine learning model that the institution system 205 or third party app store can include in the downloadable remote deposit app (at 1504). The institution system 205 may, for example, be able to select from one or more machine learning models and deterministic models and match the computing device capabilities with only those machine learning models, in type or number, that the computing device can effectively handle (at 1506). Alternatively or in addition to the computing device capability check, the institution system 205 may selectively include machine learning models or deterministic models in the remote deposit app that will differ from any machine learning or deterministic algorithm techniques the institution system 205 will itself be using on check images sent via the app during a remote deposit process.
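A sketch of this capability-matching step follows (see also step 1506); the model names, RAM/OS minimums, and the always-bundled deterministic fallback are hypothetical.

    # Sketch: match device capabilities to candidate ML models when
    # assembling the downloadable app. Names and limits are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class ModelSpec:
        name: str
        min_ram_mb: int
        min_os_version: tuple

    CANDIDATES = [
        ModelSpec("cropping_ml", min_ram_mb=2048, min_os_version=(12, 0)),
        ModelSpec("denoising_ml", min_ram_mb=4096, min_os_version=(13, 0)),
    ]

    def select_models(device_ram_mb: int, os_version: tuple) -> list[str]:
        """Include only models the reporting device can effectively run;
        a deterministic fallback is always bundled."""
        chosen = [m.name for m in CANDIDATES
                  if device_ram_mb >= m.min_ram_mb
                  and os_version >= m.min_os_version]
        return chosen + ["deterministic_fallback"]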


After selecting the machine learning models and/or deterministic technique instructions the computing device 109 can best utilize, the downloadable app may be configured to include those models or techniques and the institution system 205 may transmit them over the network connection to the computing device (at 1508). In yet other implementations, the downloadable app may be configured by the institution system 205 with a plurality of machine learning models and/or deterministic algorithm instructions at the time the app is first downloaded, and the institution system 205 may dynamically select which of the machine learning models or deterministic techniques to activate at the computing device 109 at the time a remote deposit query is initiated by the computing device.


In another embodiment illustrated in FIG. 16, when the downloaded app stored on the computing device already includes more than one ML model and/or deterministic algorithm technique for a given image processing task, for example, where a ML model and a deterministic technique included in the app are each capable of separately handling background detection and cropping, the computing device may select which particular method to use. The process may start when the computing device receives user input to initiate the remote deposit app for depositing a new check (at 1602). The remote deposit app is then executed by the processor of the computing device to provide instructions, which may be text and/or graphics, to the user to properly locate and align the check in the viewing area of a camera or other image capture mechanism associated with the computing device (at 1604). The image that is then captured, by manual input from the user or automated capture by the device, is received in memory at the computing device (at 1606). The computing device 109 may then determine the state of the captured image, such as the amount of contrast, lighting variations, and so on (at 1608). Based on the determined state, the computing device 109 may select which of the two or more image processing techniques available for a particular image processing task should be used (at 1610). For example, a quality measurement below a predetermined threshold would automatically cause the computing device to utilize a machine learning model for the task, and a quality at or above the predetermined threshold would trigger use of a deterministic algorithm for the task. This may take place for only one of the typical image processing tasks, such as edge detection/cropping, or for more than one of the image processing tasks for which there is more than one ML model or deterministic algorithm available in the downloaded app. The captured image may then be processed at the computing device using the selected image processing tool (ML model or deterministic algorithm) (at 1612).
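The threshold rule described above (low quality routes to the ML model, adequate quality to the deterministic algorithm) might be sketched as follows; the contrast proxy and cutoff value are assumptions.

    # Sketch of the quality-gated selection between bundled techniques.
    import cv2
    import numpy as np

    QUALITY_THRESHOLD = 0.35  # assumed normalized-contrast cutoff

    def image_quality(image_bgr: np.ndarray) -> float:
        gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
        return float(gray.std()) / 128.0  # crude contrast proxy

    def choose_technique(image_bgr: np.ndarray) -> str:
        # Below threshold -> ML model; at or above -> deterministic
        # algorithm, matching the rule described in the text.
        if image_quality(image_bgr) < QUALITY_THRESHOLD:
            return "ml_model"
        return "deterministic"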


In an alternative embodiment, the process of FIG. 16 may be modified to both select the optimal machine learning model and adjust the settings of the computing device to default to a particular set-up configuration. Referring to FIG. 17, the computing device 109 may download a remote capture application that includes a first machine learning model trained to select the optimal one of a plurality of trained image processing machine learning models (at 1702). This first trained model, which recognizes the optimal second machine learning model, may be part of a previously downloaded application, but the second machine learning model may be one that is selected from a plurality of trained models and later downloaded once the first machine learning model identifies the optimal second machine learning model. The first machine learning model may be one that has been trained to recognize the current environment that the computing device 109 and the check to be scanned are in, and to then select the second machine learning model that is best suited for handling image processing in view of the recognized environment. The environmental factors may include detected lighting and shadow, the type of computing device used to capture the check image, and so on. This first machine learning model may also be trained to recognize a specific check type (personal check, bank check, etc.) and check design type (e.g., striped, solid, personalized patterns, logos, cartoon characters, and so on) and to factor into the selection of the optimal second machine learning model not only the recognized environment, but also the specific type and design of the check.


The first trained machine learning model may be executed on the computing device 109 to capture an initial image of the check and automatically determine the image capture environment (for example, lighting, shadows, skew, and/or noise) for the check that is captured, and may also determine the type of check and the check design in the image (at 1704). Based on the information it has been trained to look for regarding image capture environment and check type and design, the first machine learning model may determine if there is an optimal trained second machine learning model that may be used to handle image processing and extract the check information (at 1706). The determination of whether a second model is "optimal" may be a comparison of percentage success rates, for the determined environment and check type/design, of a known plurality of specialized second machine learning models (each trained more rigorously for a particular different environment or check type/design) against a default second trained model or deterministic technique that may already be part of the earlier downloaded application. If there is no second trained model that matches the determined environment and/or check type/design such that it would be more optimal than a more generalized default second machine learning model already present on the computing device 109, then the default second machine learning model may be used (at 1712). If there is a better match to the environment or check type/design in a second machine learning model known to the first machine learning model, then that optimal second machine learning model may be downloaded to the computing device 109 and used to process the captured image (at 1708, 1710).
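A compact sketch of this optimal-model determination is below; the environment/check-type keys, success-rate table, and model names are hypothetical stand-ins for what the first trained model would actually learn.

    # Sketch of the two-stage model selection in FIG. 17; values assumed.
    DEFAULT_MODEL = "generic_crop_v1"

    # Assumed success rates per (environment, check_type) for candidates.
    SUCCESS_RATES = {
        ("low_light", "personal"): {"low_light_crop_v2": 0.96, DEFAULT_MODEL: 0.88},
        ("bright", "bank"):        {"bank_check_crop_v1": 0.97, DEFAULT_MODEL: 0.94},
    }

    def pick_second_model(environment: str, check_type: str) -> str:
        """Return a specialized model only if it beats the bundled default."""
        rates = SUCCESS_RATES.get((environment, check_type), {})
        best = max(rates, key=rates.get, default=DEFAULT_MODEL)
        if best != DEFAULT_MODEL and rates[best] > rates.get(DEFAULT_MODEL, 0.0):
            return best   # caller downloads this model if not already local
        return DEFAULT_MODEL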


In yet other embodiments, alone or in combination with any of the above, it is further contemplated that a trained machine learning model may also output computing-device-specific instructions to alter one or more settings of the computing device 109. Referring to the example of FIG. 17, the first trained machine learning model may not only identify the remote capture environment to select a second trained machine learning model for the image processing and data extraction, it may also instruct the computing device to adjust its image capture settings for a next image capture session, after the current one, based on the currently determined image capture environment. As one example, the machine learning model may prepare the operating settings of the computing device 109 for a low light or other determined environmental condition based on the currently detected environmental conditions. The settings that may be automatically changed by the model may include the type of light capture. In one embodiment, the machine learning model may instruct the device to capture the check image in infrared light, or to change a charge coupled device (CCD) setting of a sensor in the image capture device associated with the computing device. In alternative embodiments, the machine learning model can not only automatically adjust current operating settings of the image capture device of the computing device, it may also change a default operating setting for a similar environmental setting (e.g., a low light environment) for a future image capture session.


In embodiments where the image capture device consists of a camera with multiple lenses or multiple cameras as noted previously, the processor of the user's computing device 109 may execute downloaded instructions to assist with check edge detection/corner identification, via a trained machine learning model or a deterministic algorithm, by adding in the factor of depth perception available from simultaneously captured images from the multiple, separately positioned cameras that form the image capture device.


The use of multiple camera lenses/multiple cameras as the image capture device on a single computing device 109 can not only increase the detail provided on the checks themselves to improve the processing and accuracy of data pulled from the images, but can also enhance security. In one implementation, the computing device 109 can use the different information acquired by each lens of the same check to better identify/authenticate the user of the user device. As one example, the rotational angle and distance of each lens to the check during image capture of the check may be used to determine a more precise angle of the user device to the check. This information may then, in turn, be used in a deterministic algorithm or a machine learning model for comparison to prior measurements of user device angle of rotation and distance from a check typically measured for that user. The multiple camera lens image capture device information may be used in combination with, or separately from, other user identifiable information entered into or gathered by the computing device. User entered data may include name, password, biometric, token, or other direct verification information. This information may be combined with Global Positioning System (GPS) location information and sensed user information, such as how the computing device is being held as determined via the multiple camera lenses, to modify a confidence level that the user is who he or she claims to be.


Deposit suspension actions, or additional verification tasks, may be triggered at the computing device when one or more of the verification criteria do not match expectations of the system for a given user. If one or more of the rotational angle or distance measurements do not indicate the computing device is being held in a manner expected for that particular user, then an additional authentication step may be triggered at the computing device 109 and any further deposit processing stopped. For example, the processing circuitry may execute an application running on the computing device 109 that suspends check processing for that image until an additional authentication step (e.g., the user is presented with an additional authentication challenge question, or an additional biometric assessment or measurement is requested) is successfully completed. Alternatively, a machine learning user verification model may be implemented that has been trained to assess all of the available user-entered and device-sensed parameters and automatically allow a deposit transaction to proceed when enough of the parameters provide a threshold confidence level of user authenticity.


The machine learning model may implement different levels of verification according to predetermined transaction risk levels. For example, the computing device may only utilize the rotational angle and/or distance measurements from each lens to recognize a particular user's usage pattern of the computing device when an amount of the check deposit is above a minimum amount. In this manner, higher fraud-risk transactions will trigger the user device usage pattern measurement and lower fraud-risk transactions (e.g., lower transaction amounts) will not trigger the elevated user verification mechanism. Computing device 109 processing resources and power usage may thus be reduced except when higher risk transactions are involved.
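A minimal sketch of this risk-tiered gating, with assumed amount and confidence thresholds and a hypothetical usage-pattern scorer, is shown below.

    # Sketch of risk-tiered verification; thresholds are assumptions.
    HIGH_RISK_AMOUNT = 500.00
    CONFIDENCE_THRESHOLD = 0.8

    def verify_deposit(amount: float, base_confidence: float,
                       usage_pattern_score) -> str:
        """usage_pattern_score: callable returning a 0..1 match score from
        the per-lens rotation/distance measurements (hypothetical)."""
        confidence = base_confidence
        if amount >= HIGH_RISK_AMOUNT:
            # Elevated verification only for higher fraud-risk transactions.
            confidence = (confidence + usage_pattern_score()) / 2
        if confidence >= CONFIDENCE_THRESHOLD:
            return "proceed"
        return "challenge_user"  # trigger an additional authentication step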


Other implementations of a multiple camera lens image capture device may include examining details of the check the user plans to deposit to verify user-specific features. In this embodiment, the user-specific aspects of the check to be deposited may be relied on alone, or in combination with other user verification methods, such as the device orientation pattern noted above, to verify a user or strengthen the collection of user verification parameters used to verify the user of the computing device. Rather than utilizing the various different perspectives of the multiple lenses to identify the way a user is holding the user device, the multiple camera lenses can take a more critical look at the check to be deposited to identify bends, creases, folds, or tears in the check that match a pattern of those parameters found in prior checks deposited by that user. As one example, a machine learning model may be trained from past deposits from the user to look for a location of creases in a check representing a fold pattern that the particular user puts in checks he or she deposits (e.g., from a habit of folding a check in a certain way, placing it in a wallet, and then later unfolding it for a remote deposit operation). As another example of check condition parameters that may be used to identify the user via a trained machine learning model, the multiple lens image capture device may be used to acquire information on the depth of the indentations made by the user's signature endorsing the check, the color and content of the ink used in the signature, the positioning of the endorsement signature, and the endorsement signature itself. One or more of these pieces of information may be applied to a trained machine learning model that will either permit a deposit transaction to move forward, or will interrupt and pause the deposit process for additional authentication instructions to be displayed to the user and acted on before further processing is permitted.


In addition to the computing device usage and distinguishing check characteristics that may be tracked to determine or verify the user of the computing device, the background surrounding the check in the captured image may be used to verify the authenticity of a user and raise the confidence level that a transaction is not fraudulent. Assuming a user typically deposits checks from a limited number of locations (e.g., a kitchen counter or desk at home), then a same background or set of different backgrounds surrounding the check can be expected for deposits from the user. In one implementation, the image processing engine 120 of the computing device 109 may include in its machine learning model repository 129, or download from a remote server, a trained machine learning user verification model that identifies one or more of the device usage patterns, check feature patterns, or background patterns of image capture transactions for a particular user. With respect to the use of background patterns to verify user identity, a multi-camera lens image capture device may not only capture the overall pattern of the background, but may also determine texture (e.g., wood grain, surface roughness, or carpet fiber texture) and finer light quality details (the content of the light spectrum of the lighting in the area where the image is being captured, and the amount or direction of the types of lighting) that may be analyzed in the trained machine learning model to verify a user, or at least a trusted location that is being used for the user's deposit activities. This background determination, via a trained machine learning model established from that user's past deposits, may provide another layer of verification that the deposit transaction is from a legitimate source.


The features of how a computing device is being held or operated, where the computing device is located, and of the user's treatment of the check to be deposited may be the focus of separate or combined trained machine learning models that execute on the computing device or remotely via data provided by the computing device. Each user may have a separate account, stored locally on the user computing device, at a financial institution server, or in cloud storage, that stores previously captured checks and images from that user, which are used to build a history for that user and which may be used to train a machine learning/artificial intelligence-type model to profile the user for authentication and fraud prevention in future remote deposit transactions.


In alternative embodiments, the computing device may also use the trained machine learning model(s) related to user verification to switch transaction modes from a check deposit transaction to a different transfer format. In one implementation, the computing device may search for the availability of another funds transfer mechanism to substitute for the more rigorous check deposit process. As one example, a person-to-person (P2P) transaction that bypasses typical check clearance and other check deposit procedures may be automatically investigated by the computing device. For example, if enough of the authentication/verification parameters have been satisfied in a predetermined number of transactions for the user, the computing device may automatically use the data gathered from the current check image to trigger a P2P transaction from the payor bank to the payee bank rather than proceed with a check deposit transaction. In this embodiment, the current image captured by the computing device may be subjected to the heightened authenticity scrutiny available from processing the captured check image (e.g., via the multiple camera lens image capture device) through the one or more machine learning models trained with user-specific data on device usage, check features, and/or background parameters. If the current image meets the authenticity and user verification threshold, and the user has met a threshold number of prior check deposit transactions also satisfying the authenticity and user verification tests, then the computing device may either automatically contact the payor and payee banks to initiate a P2P transaction, or it may interrupt the current check deposit operation to query the user, via the display of the computing device, regarding whether to proceed with a check deposit or to instead switch the transaction to a P2P transaction.


It should be understood that the various techniques described herein may be implemented in connection with hardware or software or, where appropriate, with a combination of both. Thus, the methods and apparatus of the presently disclosed subject matter, or certain aspects or portions thereof, may take the form of program code (i.e., instructions) embodied in tangible media, such as floppy diskettes, CD-ROMs, hard drives, or any other machine-readable storage medium. When the program code is loaded into and executed by a machine, such as a computer, the machine becomes an apparatus for practicing the presently disclosed subject matter. In the case of program code execution on programmable computers, the computing device generally includes a processor, a storage medium readable by the processor (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device. One or more programs may implement or use the processes described in connection with the presently disclosed subject matter, e.g., through the use of an API, reusable controls, or the like. Such programs may be implemented in a high-level procedural or object-oriented programming language to communicate with a computer system. However, the program(s) can be implemented in assembly or machine language, if desired. In any case, the language may be a compiled or interpreted language and it may be combined with hardware implementations.


Although exemplary embodiments may refer to using aspects of the presently disclosed subject matter in the context of one or more stand-alone computer systems, the subject matter is not so limited, but rather may be implemented in connection with any computing environment, such as a network or distributed computing environment. Still further, aspects of the presently disclosed subject matter may be implemented in or across a plurality of processing chips or devices, and storage may similarly be spread across a plurality of devices. Such devices might include personal computers, network servers, and handheld devices, for example.


Various implementations have been specifically described. However, other implementations that include a fewer, or greater, number of features for each of the apparatuses, methods, or other embodiments described herein are also possible.

Claims
  • 1. A portable computing device, comprising: a processor;memory; andinstructions stored in the memory, the instructions configured to, when executed by the processor, cause the processor to: capture a digital check image depicting a check;input the data from the captured digital check image into a trained machine learning image cropping model to determine a plurality of edge coordinates corresponding to the check depicted in the digital check image;detect a check image included in the digital check image based on the plurality of edge coordinates;generate a digital image file including the digital check image and the edge coordinates; andtransmit the digital image file to a remote server for deposit.
  • 2. The computing device of claim 1, wherein the instructions are configured to, when executed by the processor, further cause the processor to: remove the background image from the digital check image by cropping the digital check image based on the edge coordinates determined by the trained machine learning model.
  • 3. The computing device of claim 1, wherein the instructions are configured to, when executed by the processor, further cause the processor to generate the digital image file to include check data received via user input.
  • 4. The computing device of claim 1, wherein the instructions are configured to, when executed by the processor, further cause the processor to: input the data from the captured digital check image into a trained machine learning image denoising model;remove handwriting from printed text on the check depicted in the digital check image based on an output of the trained machine learning image denoising model.
  • 5. The computing device of claim 1, wherein the instructions are configured to, when executed by the processor, further cause the processor to: input the data from the captured digital check image into a trained machine learning image dewarping model;obtain a coordinate conversion map for dewarping the check depicted in the digital check image based on an output of the trained machine learning image dewarping model.
  • 6. The computing device of claim 5, wherein the instructions are configured to, when executed by the processor, further cause the processor to: input the data from the captured digital check image into a trained machine learning duplicate model, wherein the trained machine learning duplicate check model comprises a perceptual hash trained model;when the trained machine learning duplicate model is stored on the portable computing device, prevent transmission of the data from the captured check to the remote server in response to a determination by the trained machine learning duplicate model that the check depicted in the digital check image is a duplicate check.
  • 7. The computing device of claim 4, wherein the instructions are configured to, when executed by the processor, further cause the processor to: input the data from the captured digital check image, after removing the handwriting from the printed text based on an output of the trained machine learning image denoising model, into a trained machine learning image dewarping model;obtain a coordinate conversion map for dewarping the check depicted in the digital check image based on an output of the trained machine learning image dewarping model.
  • 8. A portable computing device, comprising: a processor;a memory; andwherein the processor is configured to: download an app for performing image processing on, and remotely depositing to an institution, a digital check image of a check, wherein the downloaded app comprises a trained machine learning model for executing a first image processing task;control, using the downloaded app, an image capture device associated with the portable computing device to capture the digital check image;input, using the downloaded app, the captured digital check image into the trained machine learning model to modify the digital check image in accordance with the first image processing task;process the modified digital check image, using the downloaded app, with a deterministic algorithm to execute a second image processing task that is different from the first image processing task;transmit the modified and processed digital check file to a remote server for deposit.
  • 9. The portable computing device of claim 8, wherein the trained machine learning model for executing the first image processing task comprises an image cropping model configured to cause the processor of the portable computing device to remove a background image from the digital check image by cropping the digital check image based on edge coordinates determined by the trained machine learning model.
  • 10. The portable computing device of claim 8, wherein the trained machine learning model for executing the first image processing task comprises an image demorphing model configured to cause the processor of the portable computing device to remove a skew from the digital check image.
  • 11. The portable computing device of claim 10, wherein the image demorphing model causes the processor to generate a spatial transformation matrix to apply to the digital check image.
  • 12. The portable computing device of claim 8, wherein the trained machine learning model for executing the first image processing task comprises a denoising model configured to cause the processor of the portable computing device to remove handwritten marks from overlapping printed text in the digital check image.
  • 13. The portable computing device of claim 8, wherein the portable computing device comprises a smart phone and the image capture device associated with the portable computing device comprises a digital camera.
  • 14. The portable computing device of claim 8, wherein the downloaded app comprises a plurality of trained machine learning models and the processor is configured by the downloaded app to select between two or more trained machine learning models for use in a particular image processing task after determining a state of the captured digital check image.
  • 15. A portable computing device, comprising: a processor;a memory; andwherein the processor is configured to: download an app for performing image processing on, and remotely depositing to an institution, a digital check image of a check, wherein the downloaded app comprises a first trained machine learning model;control, using the downloaded app, an image capture device associated with the portable computing device to capture the digital check image;input, using the downloaded app, the captured digital check image into the first trained machine learning model to identify which of a plurality of second trained machine learning models is a best match for handling an image processing task; andbased on the determined best match for handling the image processing task, select and execute a second trained machine learning model to modify the captured digital check image.
  • 16. The portable computing device of claim 15, wherein the first trained machine learning model is trained to cause the processor to: recognize one of a specific check type or check design type of the captured digital check image; andselect one of the plurality of second trained machine learning models that is a best match for handling the image processing task based on the check type or check design type.
  • 17. The portable computing device of claim 16, wherein the first trained machine learning model is further trained to cause the processor to: recognize at least one environmental factor associated with the captured digital check image, wherein the environmental factor comprises one of a lighting or shadow detection in the captured digital check image; andcause the processor to select the one of the plurality of second trained machine learning models based on the recognized environmental factor.
  • 18. The portable computing device of claim 16, wherein the processor is configured to, when the second trained machine learning model determined as the best match for handling the image processing task is not stored in the memory on the portable computing device, download the second trained machine learning model from a remote computer.
  • 19. The portable computing device of claim 15, wherein the first trained machine learning model is further configured to cause the processor to alter a device setting of the portable computing device for an image capture operation based on a currently detected environmental condition.
  • 20. The portable computing device of claim 15, wherein the first trained machine learning model is further configured to cause the processor to alter a default device setting of the portable computing device for a future image capture operation after a current image capture operation based on a currently detected environmental condition.
US Referenced Citations (1322)
Number Name Date Kind
1748489 McCarthy et al. Feb 1930 A
2292825 Dilks et al. Aug 1942 A
3005282 Christiansen Oct 1961 A
3341820 Grillmeier, Jr. et al. Sep 1967 A
3576972 Wood May 1971 A
3593913 Bremer Jul 1971 A
3620553 Donovan Nov 1971 A
3648242 Grosbard Mar 1972 A
3800124 Walsh Mar 1974 A
3816943 Henry Jun 1974 A
4002356 Weidmann Jan 1977 A
4027142 Paup et al. May 1977 A
4060711 Buros Nov 1977 A
4070649 Wright, Jr. et al. Jan 1978 A
4128202 Buros Dec 1978 A
4136471 Austin Jan 1979 A
4205780 Burns Jun 1980 A
4264808 Owens Apr 1981 A
4305216 Skelton Dec 1981 A
4321672 Braun Mar 1982 A
4346442 Musmanno Aug 1982 A
4417136 Rushby et al. Nov 1983 A
4433436 Carnes Feb 1984 A
4454610 Sziklai Jun 1984 A
RE31692 Tyburski et al. Oct 1984 E
4523330 Cain Jun 1985 A
4636099 Goldston Jan 1987 A
4640413 Kaplan Feb 1987 A
4644144 Chandek Feb 1987 A
4722444 Murphy et al. Feb 1988 A
4722544 Weber Feb 1988 A
4727435 Otani Feb 1988 A
4737911 Freeman Apr 1988 A
4739411 Bolton Apr 1988 A
4774574 Daly et al. Sep 1988 A
4774663 Musmanno Sep 1988 A
4790475 Griffin Dec 1988 A
4806780 Yamamoto Feb 1989 A
4837693 Schotz Jun 1989 A
4890228 Longfield Dec 1989 A
4896363 Taylor et al. Jan 1990 A
4927071 Wood May 1990 A
4934587 McNabb Jun 1990 A
4960981 Benton Oct 1990 A
4975735 Bright Dec 1990 A
5022683 Barbour Jun 1991 A
5053607 Carlson Oct 1991 A
5077805 Tan Dec 1991 A
5091968 Higgins et al. Feb 1992 A
5122950 Benton et al. Jun 1992 A
5134564 Dunn et al. Jul 1992 A
5146606 Grondalski Sep 1992 A
5157620 Shaar Oct 1992 A
5159548 Caslavka Oct 1992 A
5164833 Aoki Nov 1992 A
5175682 Higashiyama et al. Dec 1992 A
5187750 Behera Feb 1993 A
5191525 LeBrun Mar 1993 A
5193121 Elischer et al. Mar 1993 A
5220501 Lawlor Jun 1993 A
5227863 Bilbrey et al. Jul 1993 A
5229589 Schneider Jul 1993 A
5233547 Kapp et al. Aug 1993 A
5237158 Kern et al. Aug 1993 A
5237159 Stephens Aug 1993 A
5237620 Deaton et al. Aug 1993 A
5257320 Etherington et al. Oct 1993 A
5265008 Benton Nov 1993 A
5268968 Yoshida Dec 1993 A
5283829 Anderson Feb 1994 A
5321816 Rogan Jun 1994 A
5345090 Hludzinski Sep 1994 A
5347302 Simonoff Sep 1994 A
5350906 Brody Sep 1994 A
5373550 Campbell Dec 1994 A
5383113 Kight et al. Jan 1995 A
5419588 Wood May 1995 A
5422467 Graef Jun 1995 A
5444616 Nair et al. Aug 1995 A
5444794 Uhland, Sr. Aug 1995 A
5455875 Chevion et al. Oct 1995 A
5475403 Havlovick et al. Dec 1995 A
5504538 Tsujihara Apr 1996 A
5504677 Pollin Apr 1996 A
5528387 Kelly et al. Jun 1996 A
5530773 Thompson Jun 1996 A
5577179 Blank Nov 1996 A
5583759 Geer Dec 1996 A
5590196 Moreau Dec 1996 A
5594225 Botvin Jan 1997 A
5598969 Ong Feb 1997 A
5602936 Green Feb 1997 A
5610726 Nonoshita Mar 1997 A
5611028 Shibasaki Mar 1997 A
5630073 Nolan May 1997 A
5631984 Graf et al. May 1997 A
5664027 Ittner Sep 1997 A
5668897 Stolfo Sep 1997 A
5673320 Ray et al. Sep 1997 A
5677955 Doggett Oct 1997 A
5678046 Cahill et al. Oct 1997 A
5679938 Templeton Oct 1997 A
5680611 Rail Oct 1997 A
5691524 Josephson Nov 1997 A
5699452 Vaidyanathan Dec 1997 A
5734747 Vaidyanathan Mar 1998 A
5737440 Kunkler Apr 1998 A
5748780 Stolfo May 1998 A
5751842 Riach May 1998 A
5761686 Bloomberg Jun 1998 A
5784503 Bleecker, III et al. Jul 1998 A
5830609 Warner Nov 1998 A
5832463 Funk Nov 1998 A
5838814 Moore Nov 1998 A
5848185 Koga et al. Dec 1998 A
5859935 Johnson et al. Jan 1999 A
5863075 Rich Jan 1999 A
5870456 Rogers Feb 1999 A
5870724 Lawlor Feb 1999 A
5870725 Bellinger et al. Feb 1999 A
5878337 Joao Mar 1999 A
5889884 Hashimoto et al. Mar 1999 A
5890141 Carney et al. Mar 1999 A
5893101 Balogh et al. Apr 1999 A
5897625 Gustin Apr 1999 A
5898157 Mangili et al. Apr 1999 A
5901253 Tretter May 1999 A
5903878 Talati May 1999 A
5903881 Schrader May 1999 A
5903904 Peairs May 1999 A
5910988 Ballard Jun 1999 A
5917931 Kunkler Jun 1999 A
5924737 Schrupp Jul 1999 A
5926548 Okamoto Jul 1999 A
5930501 Neil Jul 1999 A
5930778 Geer Jul 1999 A
5937396 Konya Aug 1999 A
5940844 Cahill Aug 1999 A
5982918 Mennie Nov 1999 A
5987439 Gustin et al. Nov 1999 A
6005623 Takahashi Dec 1999 A
6012048 Gustin et al. Jan 2000 A
6014454 Kunkler Jan 2000 A
6021202 Anderson Feb 2000 A
6021397 Jones Feb 2000 A
6023705 Bellinger et al. Feb 2000 A
6029887 Furuhashi Feb 2000 A
6030000 Diamond Feb 2000 A
6032137 Ballard Feb 2000 A
6038553 Hyde Mar 2000 A
6053405 Irwin, Jr. et al. Apr 2000 A
6059185 Funk et al. May 2000 A
6064753 Bolle et al. May 2000 A
6064762 Haenel May 2000 A
6072941 Suzuki et al. Jun 2000 A
6073119 Borenmisza-Wahr Jun 2000 A
6073121 Ramzy Jun 2000 A
6085168 Mori Jul 2000 A
6086708 Colgate Jul 2000 A
6089450 Koeple Jul 2000 A
6089610 Greene Jul 2000 A
6092047 Hyman et al. Jul 2000 A
6097834 Krouse Aug 2000 A
6097845 Ng et al. Aug 2000 A
6097885 Rayner Aug 2000 A
6105865 Hardesty Aug 2000 A
6128603 Dent et al. Oct 2000 A
6141339 Kaplan et al. Oct 2000 A
6145738 Stinson et al. Nov 2000 A
6148102 Stolin Nov 2000 A
6149056 Stinson et al. Nov 2000 A
6151409 Chen et al. Nov 2000 A
6151423 Melen Nov 2000 A
6151426 Lee Nov 2000 A
6159585 Rittenhouse Dec 2000 A
6170744 Lee Jan 2001 B1
6178270 Taylor et al. Jan 2001 B1
6178409 Weber et al. Jan 2001 B1
6181837 Cahill et al. Jan 2001 B1
6188506 Kaiserman Feb 2001 B1
6189785 Lowery Feb 2001 B1
6192165 Irons Feb 2001 B1
6195452 Royer Feb 2001 B1
6195694 Chen et al. Feb 2001 B1
6199055 Kara Mar 2001 B1
6236009 Emigh et al. May 2001 B1
6243689 Norton Jun 2001 B1
6278983 Ball Aug 2001 B1
6282523 Tedesco et al. Aug 2001 B1
6282826 Richards Sep 2001 B1
6289178 Kazami Sep 2001 B1
6293469 Masson et al. Sep 2001 B1
6304860 Martin Oct 2001 B1
6310647 Parulski et al. Oct 2001 B1
6314452 Dekel Nov 2001 B1
6315195 Ramachandrun Nov 2001 B1
6317727 May Nov 2001 B1
6328207 Gregoire et al. Dec 2001 B1
6330546 Gopinathan et al. Dec 2001 B1
6339658 Moccagatta Jan 2002 B1
6339766 Gephart Jan 2002 B1
6351553 Hayosh Feb 2002 B1
6351735 Deaton et al. Feb 2002 B1
6354490 Weiss et al. Mar 2002 B1
6363162 Moed et al. Mar 2002 B1
6363164 Jones et al. Mar 2002 B1
6390362 Martin May 2002 B1
6397196 Kravetz May 2002 B1
6408084 Foley Jun 2002 B1
6411725 Rhoads Jun 2002 B1
6411737 Wesolkowski et al. Jun 2002 B2
6411938 Gates et al. Jun 2002 B1
6413305 Mehta Jul 2002 B1
6417869 Do Jul 2002 B1
6425017 Dievendorff Jul 2002 B1
6429952 Olbricht Aug 2002 B1
6439454 Masson et al. Aug 2002 B1
6449397 Che-Chu Sep 2002 B1
6450403 Martens et al. Sep 2002 B1
6463220 Dance et al. Oct 2002 B1
6464134 Page Oct 2002 B1
6469745 Yamada et al. Oct 2002 B1
6470325 Leemhuis Oct 2002 B1
6473519 Pidhirny et al. Oct 2002 B1
6502747 Stoutenburg et al. Jan 2003 B1
6505178 Flenley Jan 2003 B1
6546119 Ciolli et al. Apr 2003 B2
6564380 Murphy May 2003 B1
6574377 Cahill et al. Jun 2003 B1
6574609 Downs Jun 2003 B1
6578760 Otto Jun 2003 B1
6587837 Spagna Jul 2003 B1
6606117 Windle Aug 2003 B1
6609200 Anderson Aug 2003 B2
6611598 Hayosh Aug 2003 B1
6614930 Agnihotri et al. Sep 2003 B1
6643416 Daniels Nov 2003 B1
6647136 Jones et al. Nov 2003 B2
6654487 Downs, Jr. Nov 2003 B1
6661910 Jones et al. Dec 2003 B2
6668372 Wu Dec 2003 B1
6669086 Abdi et al. Dec 2003 B2
6672452 Alves Jan 2004 B1
6682452 Quintus Jan 2004 B2
6695204 Stinson Feb 2004 B1
6697091 Rzepkowski et al. Feb 2004 B1
6704039 Pena Mar 2004 B2
6711474 Treyz et al. Mar 2004 B1
6726097 Graef Apr 2004 B2
6728397 McNeal Apr 2004 B2
6738087 Belkin et al. May 2004 B2
6738496 Van Hall May 2004 B1
6742128 Joiner May 2004 B1
6745186 Testa et al. Jun 2004 B1
6754640 Bozeman Jun 2004 B2
6755340 Voss et al. Jun 2004 B1
6760414 Schurko et al. Jul 2004 B1
6760470 Bogosian et al. Jul 2004 B1
6763226 McZeal Jul 2004 B1
6781962 Williams Aug 2004 B1
6786398 Stinson et al. Sep 2004 B1
6789054 Makhlouf Sep 2004 B1
6796489 Slater et al. Sep 2004 B2
6796491 Nakajima Sep 2004 B2
6806903 Okisu et al. Oct 2004 B1
6807294 Yamazaki Oct 2004 B2
6813733 Li Nov 2004 B1
6829704 Zhang Dec 2004 B2
6844885 Anderson Jan 2005 B2
6856965 Stinson Feb 2005 B1
6863214 Garner et al. Mar 2005 B2
6870947 Kelland Mar 2005 B2
6873728 Bernstein et al. Mar 2005 B2
6883140 Acker Apr 2005 B1
6898314 Kung et al. May 2005 B2
6902105 Koakutsu Jun 2005 B2
6910023 Schibi Jun 2005 B1
6913188 Wong Jul 2005 B2
6922487 Dance et al. Jul 2005 B2
6930718 Parulski et al. Aug 2005 B2
6931255 Mekuria Aug 2005 B2
6931591 Brown Aug 2005 B1
6934719 Nally Aug 2005 B2
6944773 Abrahams Sep 2005 B1
6947610 Sun Sep 2005 B2
6957770 Robinson Oct 2005 B1
6961689 Greenberg Nov 2005 B1
6970843 Forte Nov 2005 B1
6973589 Wright Dec 2005 B2
6983886 Natsukari et al. Jan 2006 B2
6993507 Meyer Jan 2006 B2
6996263 Jones et al. Feb 2006 B2
6999943 Johnson Feb 2006 B1
7003040 Yi Feb 2006 B2
7004382 Sandru Feb 2006 B2
7010155 Koakutsu et al. Mar 2006 B2
7010507 Anderson Mar 2006 B1
7016704 Pallakoff Mar 2006 B2
7027171 Watanabe Apr 2006 B1
7028886 Maloney Apr 2006 B1
7039048 Monta May 2006 B1
7046991 Little May 2006 B2
7051001 Slater May 2006 B1
7058036 Yu Jun 2006 B1
7062099 Li et al. Jun 2006 B2
7062456 Riehl et al. Jun 2006 B1
7062768 Kubo Jun 2006 B2
7072862 Wilson Jul 2006 B1
7076458 Lawlor et al. Jul 2006 B2
7086003 Demsky Aug 2006 B2
7092561 Downs, Jr. Aug 2006 B2
7104443 Paul et al. Sep 2006 B1
7113925 Waserstein Sep 2006 B2
7114649 Nelson Oct 2006 B2
7116446 Maurer Oct 2006 B2
7117171 Pollin Oct 2006 B1
7120461 Cho Oct 2006 B2
7131571 Swift et al. Nov 2006 B2
7139594 Nagatomo Nov 2006 B2
7140539 Crews Nov 2006 B1
7163347 Lugg Jan 2007 B2
7178721 Maloney Feb 2007 B2
7181430 Buchanan et al. Feb 2007 B1
7184980 Allen-Rouman et al. Feb 2007 B2
7185805 McShirley Mar 2007 B1
7197173 Jones et al. Mar 2007 B2
7200255 Jones Apr 2007 B2
7204412 Foss, Jr. Apr 2007 B2
7207478 Blackson et al. Apr 2007 B1
7216106 Buchanan May 2007 B1
7219082 Forte May 2007 B2
7219831 Murata May 2007 B2
7240336 Baker Jul 2007 B1
7245765 Myers et al. Jul 2007 B2
7249076 Pendleton Jul 2007 B1
7252224 Verma Aug 2007 B2
7257246 Brodie et al. Aug 2007 B1
7266230 Doran Sep 2007 B2
7277191 Metcalfe et al. Oct 2007 B2
7290034 Budd Oct 2007 B2
7296734 Pliha Nov 2007 B2
7299970 Ching Nov 2007 B1
7299979 Phillips Nov 2007 B2
7313543 Crane Dec 2007 B1
7314163 Crews et al. Jan 2008 B1
7321874 Dilip Jan 2008 B2
7321875 Dilip Jan 2008 B2
7325725 Foss, Jr. Feb 2008 B2
7328190 Smith et al. Feb 2008 B2
7330604 Wu et al. Feb 2008 B2
7331523 Meier et al. Feb 2008 B2
7336813 Prakash et al. Feb 2008 B2
7343320 Treyz Mar 2008 B1
7349566 Jones et al. Mar 2008 B2
7349585 Li Mar 2008 B2
7350697 Swift et al. Apr 2008 B2
7356505 March Apr 2008 B2
7369713 Suino May 2008 B2
7377425 Ma May 2008 B1
7379978 Anderson May 2008 B2
7383227 Weinflash et al. Jun 2008 B2
7385631 Maeno Jun 2008 B2
7386511 Buchanan Jun 2008 B2
7388683 Rodriguez et al. Jun 2008 B2
7389912 Starrs Jun 2008 B2
7391897 Jones et al. Jun 2008 B2
7391934 Goodall et al. Jun 2008 B2
7392935 Byrne Jul 2008 B2
7401048 Rosedale Jul 2008 B2
7403917 Larsen Jul 2008 B1
7406198 Aoki et al. Jul 2008 B2
7419093 Blackson et al. Sep 2008 B1
7421107 Lugg Sep 2008 B2
7421410 Schechtman et al. Sep 2008 B1
7427016 Chimento Sep 2008 B2
7433098 Klein et al. Oct 2008 B2
7437327 Lam Oct 2008 B2
7440924 Buchanan Oct 2008 B2
7447347 Weber Nov 2008 B2
7455220 Phillips Nov 2008 B2
7455221 Sheaffer Nov 2008 B2
7460108 Tamura Dec 2008 B2
7460700 Tsunachima et al. Dec 2008 B2
7461779 Ramachandran Dec 2008 B2
7461780 Potts Dec 2008 B2
7464859 Hawkins Dec 2008 B1
7471818 Price Dec 2008 B1
7475040 Buchanan Jan 2009 B2
7477923 Wallmark Jan 2009 B2
7480382 Dunbar Jan 2009 B2
7480422 Ackley et al. Jan 2009 B2
7489953 Griffin Feb 2009 B2
7490242 Torres Feb 2009 B2
7497429 Reynders Mar 2009 B2
7503486 Ahles Mar 2009 B2
7505759 Rahman Mar 2009 B1
7506261 Statou Mar 2009 B2
7509287 Nutahara Mar 2009 B2
7512564 Geer Mar 2009 B1
7519560 Lam Apr 2009 B2
7520420 Phillips Apr 2009 B2
7520422 Robinson et al. Apr 2009 B1
7536354 deGroeve et al. May 2009 B1
7536440 Budd May 2009 B2
7539646 Gilder May 2009 B2
7540408 Levine Jun 2009 B2
7542598 Jones Jun 2009 B2
7545529 Borrey et al. Jun 2009 B2
7548641 Gilson et al. Jun 2009 B2
7566002 Love et al. Jul 2009 B2
7568615 Corona et al. Aug 2009 B2
7571848 Cohen Aug 2009 B2
7577614 Warren et al. Aug 2009 B1
7587066 Cordery et al. Sep 2009 B2
7587363 Cataline Sep 2009 B2
7590275 Clarke et al. Sep 2009 B2
7599543 Jones Oct 2009 B2
7599888 Manfre Oct 2009 B2
7602956 Jones Oct 2009 B2
7606762 Heit Oct 2009 B1
7609873 Foth et al. Oct 2009 B2
7609889 Guo et al. Oct 2009 B2
7619721 Jones Nov 2009 B2
7620231 Jones Nov 2009 B2
7620604 Bueche, Jr. Nov 2009 B1
7630518 Frew et al. Dec 2009 B2
7644037 Ostrovsky Jan 2010 B1
7644043 Minowa Jan 2010 B2
7647275 Jones Jan 2010 B2
7647897 Jones Jan 2010 B2
7668363 Price Feb 2010 B2
7672022 Fan Mar 2010 B1
7672940 Viola Mar 2010 B2
7676409 Ahmad Mar 2010 B1
7680732 Davies et al. Mar 2010 B1
7680735 Loy Mar 2010 B1
7689482 Lam Mar 2010 B2
7697776 Wu et al. Apr 2010 B2
7698222 Bueche, Jr. Apr 2010 B1
7702588 Gilder et al. Apr 2010 B2
7714778 Dupray May 2010 B2
7720735 Anderson et al. May 2010 B2
7734545 Fogliano Jun 2010 B1
7743979 Fredman Jun 2010 B2
7753268 Robinson et al. Jul 2010 B1
7761358 Craig et al. Jul 2010 B2
7766223 Mello Aug 2010 B1
7766244 Field Aug 2010 B1
7769650 Bleunven Aug 2010 B2
7778457 Nepomniachtchi et al. Aug 2010 B2
7792752 Kay Sep 2010 B1
7792753 Slater et al. Sep 2010 B1
7793833 Yoon et al. Sep 2010 B2
7810714 Murata Oct 2010 B2
7812986 Graham et al. Oct 2010 B2
7818245 Prakash et al. Oct 2010 B2
7831458 Neumann Nov 2010 B2
7856402 Kay Dec 2010 B1
7865384 Anderson et al. Jan 2011 B2
7865425 Waelbroeck Jan 2011 B2
7873200 Oakes, III et al. Jan 2011 B1
7873556 Dolan Jan 2011 B1
7876949 Oakes, III et al. Jan 2011 B1
7885451 Walls et al. Feb 2011 B1
7885880 Prasad et al. Feb 2011 B1
7894094 Nacman et al. Feb 2011 B2
7895054 Slen et al. Feb 2011 B2
7896232 Prasad et al. Mar 2011 B1
7900822 Prasad et al. Mar 2011 B1
7903863 Jones et al. Mar 2011 B2
7904386 Kalra et al. Mar 2011 B2
7912785 Kay Mar 2011 B1
7935441 Tononishi May 2011 B2
7949587 Morris et al. May 2011 B1
7950698 Popadic et al. May 2011 B2
7953441 Lors May 2011 B2
7958053 Stone Jun 2011 B2
7962411 Prasad et al. Jun 2011 B1
7970677 Oakes, III et al. Jun 2011 B1
7974869 Sharma Jul 2011 B1
7974899 Prasad et al. Jul 2011 B1
7978900 Nepomniachtchi et al. Jul 2011 B2
7979326 Kurushima Jul 2011 B2
7987231 Karkanias Jul 2011 B2
7996312 Beck et al. Aug 2011 B1
7996314 Smith et al. Aug 2011 B1
7996315 Smith et al. Aug 2011 B1
7996316 Smith et al. Aug 2011 B1
8000514 Nepomniachtchi et al. Aug 2011 B2
8001051 Smith et al. Aug 2011 B1
8009931 Li Aug 2011 B2
8045784 Price et al. Oct 2011 B2
8046301 Smith et al. Oct 2011 B1
8051453 Arseneau et al. Nov 2011 B2
8060442 Hecht et al. Nov 2011 B1
8064729 Li Nov 2011 B2
8065307 Haslam et al. Nov 2011 B2
8091778 Block et al. Jan 2012 B1
8106956 Nikkanen Jan 2012 B2
8116533 Kiplinger et al. Feb 2012 B2
8118654 Nicolas Feb 2012 B1
8131636 Viera et al. Mar 2012 B1
8159520 Dhanoa Apr 2012 B1
8203640 Kim et al. Jun 2012 B2
8204293 Csulits et al. Jun 2012 B2
8235284 Prasad et al. Aug 2012 B1
8266076 Lopez et al. Sep 2012 B2
8271385 Emerson et al. Sep 2012 B2
8290237 Burks et al. Oct 2012 B1
8313020 Ramachandran Nov 2012 B2
8320657 Burks et al. Nov 2012 B1
8332329 Thiele Dec 2012 B1
8341077 Nichols et al. Dec 2012 B1
8351677 Oakes, III et al. Jan 2013 B1
8351678 Medina, III Jan 2013 B1
8358826 Medina et al. Jan 2013 B1
8364563 Choiniere, Sr. Jan 2013 B2
8369650 Zamfir et al. Feb 2013 B2
8374963 Billman Feb 2013 B1
8391599 Medina, III Mar 2013 B1
8392332 Oakes et al. Mar 2013 B1
8396623 Maeda et al. Mar 2013 B2
8401962 Bent et al. Mar 2013 B1
8422758 Bueche, Jr. Apr 2013 B1
8433127 Harpel et al. Apr 2013 B1
8433647 Yarbrough Apr 2013 B1
8452689 Medina, III May 2013 B1
RE44274 Popadic et al. Jun 2013 E
8464933 Prasad et al. Jun 2013 B1
8483473 Roach Jul 2013 B2
8531518 Zomet Sep 2013 B1
8538124 Harpel et al. Sep 2013 B1
8542921 Medina Sep 2013 B1
8548267 Yacoub et al. Oct 2013 B1
8559766 Tilt et al. Oct 2013 B2
8582862 Nepomniachtchi et al. Nov 2013 B2
8611635 Medina, III Dec 2013 B1
8660952 Viera et al. Feb 2014 B1
8688579 Ethington et al. Apr 2014 B1
8699779 Prasad et al. Apr 2014 B1
8708227 Oakes, III et al. Apr 2014 B1
8731321 Fujiwara et al. May 2014 B2
8732081 Oakes, III et al. May 2014 B1
8751345 Borzych et al. Jun 2014 B1
8751356 Garcia Jun 2014 B1
8751379 Bueche, Jr. Jun 2014 B1
8768038 Sherman et al. Jul 2014 B1
8768836 Acharya Jul 2014 B1
8799147 Walls et al. Aug 2014 B1
8818033 Liu Aug 2014 B1
8824772 Viera Sep 2014 B2
8837806 Ethington et al. Sep 2014 B1
8843405 Hartman et al. Sep 2014 B1
8929640 Mennie et al. Jan 2015 B1
8959033 Oakes, III et al. Feb 2015 B1
8977571 Bueche, Jr. et al. Mar 2015 B1
8990862 Smith Mar 2015 B1
9009071 Watson et al. Apr 2015 B1
9036040 Danko May 2015 B1
9058512 Medina, III Jun 2015 B1
9064284 Janiszeski et al. Jun 2015 B1
9129340 Medina, III et al. Sep 2015 B1
9159101 Pollack et al. Oct 2015 B1
9177197 Prasad et al. Nov 2015 B1
9177198 Prasad et al. Nov 2015 B1
9195986 Christy et al. Nov 2015 B2
9224136 Oakes, III et al. Dec 2015 B1
9235860 Boucher et al. Jan 2016 B1
9270804 Dees et al. Feb 2016 B2
9286514 Newman Mar 2016 B1
9311634 Hildebrand Apr 2016 B1
9336517 Prasad et al. May 2016 B1
9384409 Ming Jul 2016 B1
9387813 Moeller et al. Jul 2016 B1
9390339 Danko Jul 2016 B1
9401011 Medina, III et al. Jul 2016 B2
9424569 Sherman et al. Aug 2016 B1
9524269 Brinkmann et al. Dec 2016 B1
9569756 Bueche, Jr. et al. Feb 2017 B1
9613467 Roberts et al. Apr 2017 B2
9613469 Fish et al. Apr 2017 B2
9619872 Medina, III et al. Apr 2017 B1
9626183 Smith et al. Apr 2017 B1
9626662 Prasad et al. Apr 2017 B1
9779392 Prasad et al. Oct 2017 B1
9779452 Medina et al. Oct 2017 B1
9785929 Watson et al. Oct 2017 B1
9792654 Limas et al. Oct 2017 B1
9818090 Bueche, Jr. et al. Nov 2017 B1
9824453 Collins Nov 2017 B1
9886642 Danko Feb 2018 B1
9892454 Pollack et al. Feb 2018 B1
9898778 Pollack et al. Feb 2018 B1
9898808 Medina, III et al. Feb 2018 B1
9904848 Newman Feb 2018 B1
9946923 Medina Apr 2018 B1
10013605 Oakes, III et al. Jul 2018 B1
10013681 Oakes, III et al. Jul 2018 B1
10157326 Long et al. Dec 2018 B2
10181087 Danko Jan 2019 B1
10210767 Johansen Feb 2019 B2
10217375 Waldron Feb 2019 B2
10235660 Bueche, Jr. et al. Mar 2019 B1
10325420 Moon Jun 2019 B1
10354235 Medina Jul 2019 B1
10360448 Newman Jul 2019 B1
10373136 Pollack et al. Aug 2019 B1
10380559 Oakes, III et al. Aug 2019 B1
10380562 Prasad et al. Aug 2019 B1
10380565 Prasad Aug 2019 B1
10380683 Voutour et al. Aug 2019 B1
10380993 Clauer Salyers Aug 2019 B1
10402638 Oakes, III et al. Sep 2019 B1
10402790 Clark et al. Sep 2019 B1
10402944 Pribble et al. Sep 2019 B1
10460295 Oakes, III et al. Oct 2019 B1
10460381 Pollack et al. Oct 2019 B1
10482432 Oakes, III et al. Nov 2019 B1
10504185 Buentello Dec 2019 B1
10521781 Singfield Dec 2019 B1
10552810 Ethington et al. Feb 2020 B1
10574879 Prasad et al. Feb 2020 B1
10621559 Oakes, III et al. Apr 2020 B1
10621660 Medina et al. Apr 2020 B1
10706466 Ethington et al. Jul 2020 B1
10713629 Medina, III Jul 2020 B1
10719815 Oakes, III et al. Jul 2020 B1
10769598 Oakes, III et al. Sep 2020 B1
10769603 Prasad Sep 2020 B1
10810561 Pollack et al. Oct 2020 B1
10818282 Clauer Salyers Oct 2020 B1
10839358 Prasad et al. Nov 2020 B1
10846667 Hecht Nov 2020 B1
10848665 Prasad et al. Nov 2020 B1
10855914 Prasad et al. Dec 2020 B1
10896408 Prasad et al. Jan 2021 B1
10915879 Pollack et al. Feb 2021 B1
10956728 Voutour Mar 2021 B1
10956879 Eidson et al. Mar 2021 B1
11030752 Backlund Jun 2021 B1
11042940 Limas Jun 2021 B1
11042941 Limas Jun 2021 B1
11062130 Medina, III Jul 2021 B1
11062131 Medina, III Jul 2021 B1
11062283 Prasad Jul 2021 B1
11064111 Prasad Jul 2021 B1
11068976 Voutour Jul 2021 B1
11070868 Mortensen Jul 2021 B1
11121989 Castinado Sep 2021 B1
11182753 Oakes, III et al. Nov 2021 B1
11222315 Prasad et al. Jan 2022 B1
11232517 Medina et al. Jan 2022 B1
11250398 Prasad et al. Feb 2022 B1
11288898 Moon Mar 2022 B1
11328267 Medina, III May 2022 B1
20010004235 Maloney Jun 2001 A1
20010014881 Drummond Aug 2001 A1
20010016084 Pollard et al. Aug 2001 A1
20010018739 Anderson Aug 2001 A1
20010020949 Gong et al. Sep 2001 A1
20010027994 Hayashida Oct 2001 A1
20010030695 Prabhu et al. Oct 2001 A1
20010037299 Nichols et al. Nov 2001 A1
20010042171 Vermeulen Nov 2001 A1
20010042785 Walker Nov 2001 A1
20010043748 Wesolkowski et al. Nov 2001 A1
20010047330 Gephart Nov 2001 A1
20010051965 Guillevic Dec 2001 A1
20010054020 Barth et al. Dec 2001 A1
20020001393 Jones Jan 2002 A1
20020013767 Katz Jan 2002 A1
20020016763 March Feb 2002 A1
20020016769 Barbara et al. Feb 2002 A1
20020023055 Antognini et al. Feb 2002 A1
20020025085 Gustafson et al. Feb 2002 A1
20020026418 Koppel et al. Feb 2002 A1
20020032656 Chen Mar 2002 A1
20020038289 Lawlor et al. Mar 2002 A1
20020040340 Yoshida Apr 2002 A1
20020052841 Guthrie May 2002 A1
20020052853 Munoz May 2002 A1
20020065786 Martens et al. May 2002 A1
20020072974 Pugliese Jun 2002 A1
20020075380 Seeger et al. Jun 2002 A1
20020075524 Blair Jun 2002 A1
20020084321 Martens Jul 2002 A1
20020087467 Mascavage, III et al. Jul 2002 A1
20020107767 McClair et al. Aug 2002 A1
20020107809 Biddle et al. Aug 2002 A1
20020116329 Serbetcioglu Aug 2002 A1
20020116335 Star Aug 2002 A1
20020118891 Rudd Aug 2002 A1
20020120562 Opiela Aug 2002 A1
20020120582 Elston et al. Aug 2002 A1
20020120846 Stewart et al. Aug 2002 A1
20020129249 Maillard et al. Sep 2002 A1
20020130868 Smith Sep 2002 A1
20020133409 Sawano et al. Sep 2002 A1
20020138445 Laage Sep 2002 A1
20020138522 Muralidhar Sep 2002 A1
20020145035 Jones Oct 2002 A1
20020147798 Huang Oct 2002 A1
20020150279 Scott Oct 2002 A1
20020150311 Lynn Oct 2002 A1
20020152160 Allen-Rouman et al. Oct 2002 A1
20020152161 Aoike Oct 2002 A1
20020152164 Dutta Oct 2002 A1
20020152165 Dutta et al. Oct 2002 A1
20020152169 Dutta et al. Oct 2002 A1
20020152170 Dutta et al. Oct 2002 A1
20020153414 Stoutenburg et al. Oct 2002 A1
20020154127 Vienneau et al. Oct 2002 A1
20020154815 Mizutani Oct 2002 A1
20020159648 Alderson et al. Oct 2002 A1
20020169715 Ruth et al. Nov 2002 A1
20020171820 Okamura Nov 2002 A1
20020172516 Aoyama Nov 2002 A1
20020178112 Goeller Nov 2002 A1
20020186881 Li Dec 2002 A1
20020188564 Star Dec 2002 A1
20020195485 Pomerleau et al. Dec 2002 A1
20030005326 Flemming Jan 2003 A1
20030009420 Jones Jan 2003 A1
20030015583 Abdi et al. Jan 2003 A1
20030018897 Bellis, Jr. et al. Jan 2003 A1
20030023557 Moore Jan 2003 A1
20030026609 Parulski Feb 2003 A1
20030038227 Sesek Feb 2003 A1
20030046223 Crawford Mar 2003 A1
20030050889 Burke Mar 2003 A1
20030051138 Maeda et al. Mar 2003 A1
20030053692 Hong et al. Mar 2003 A1
20030055756 Allan Mar 2003 A1
20030055776 Samuelson Mar 2003 A1
20030072568 Lin et al. Apr 2003 A1
20030074315 Lam Apr 2003 A1
20030075596 Koakutsu Apr 2003 A1
20030075916 Gorski Apr 2003 A1
20030078883 Stewart et al. Apr 2003 A1
20030081121 Swan May 2003 A1
20030081824 Mennie May 2003 A1
20030086615 Dance et al. May 2003 A1
20030093367 Allen-Rouman et al. May 2003 A1
20030093369 Ijichi et al. May 2003 A1
20030097592 Adusumilli May 2003 A1
20030102714 Rhodes et al. Jun 2003 A1
20030105688 Brown et al. Jun 2003 A1
20030105714 Alarcon-Luther et al. Jun 2003 A1
20030119478 Nagy et al. Jun 2003 A1
20030126078 Vihinen Jul 2003 A1
20030126082 Omura et al. Jul 2003 A1
20030130940 Hansen et al. Jul 2003 A1
20030130958 Narayanan et al. Jul 2003 A1
20030132384 Sugiyama et al. Jul 2003 A1
20030133608 Bernstein et al. Jul 2003 A1
20030133610 Nagarajan et al. Jul 2003 A1
20030135457 Stewart et al. Jul 2003 A1
20030139999 Rowe Jul 2003 A1
20030159046 Choi et al. Aug 2003 A1
20030167225 Adams Sep 2003 A1
20030177448 Levine et al. Sep 2003 A1
20030187790 Swift et al. Oct 2003 A1
20030191615 Bailey Oct 2003 A1
20030191869 Williams Oct 2003 A1
20030200107 Allen et al. Oct 2003 A1
20030200174 Star Oct 2003 A1
20030202690 Jones et al. Oct 2003 A1
20030212904 Randle et al. Nov 2003 A1
20030213841 Josephson et al. Nov 2003 A1
20030217005 Drummond et al. Nov 2003 A1
20030218061 Filatov Nov 2003 A1
20030225705 Park et al. Dec 2003 A1
20030231285 Ferguson Dec 2003 A1
20030233278 Marshall Dec 2003 A1
20030233318 King et al. Dec 2003 A1
20040010466 Anderson Jan 2004 A1
20040010803 Berstis Jan 2004 A1
20040012496 De Souza Jan 2004 A1
20040012679 Fan Jan 2004 A1
20040013284 Yu Jan 2004 A1
20040017482 Weitman Jan 2004 A1
20040024626 Bruijning Feb 2004 A1
20040024708 Masuda Feb 2004 A1
20040029591 Chapman et al. Feb 2004 A1
20040030741 Wolton et al. Feb 2004 A1
20040044606 Buttridge et al. Mar 2004 A1
20040057697 Renzi Mar 2004 A1
20040058705 Morgan Mar 2004 A1
20040061913 Takiguchi Apr 2004 A1
20040066031 Wong Apr 2004 A1
20040066419 Pyhalammi Apr 2004 A1
20040069841 Wong Apr 2004 A1
20040071333 Douglas et al. Apr 2004 A1
20040075754 Nakajima et al. Apr 2004 A1
20040076320 Downs, Jr. Apr 2004 A1
20040078299 Down-Logan Apr 2004 A1
20040080795 Bean et al. Apr 2004 A1
20040089711 Sandru May 2004 A1
20040093303 Picciallo May 2004 A1
20040093305 Kight May 2004 A1
20040103057 Melbert et al. May 2004 A1
20040103296 Harp May 2004 A1
20040109596 Doran Jun 2004 A1
20040110975 Osinski et al. Jun 2004 A1
20040111371 Friedman Jun 2004 A1
20040117302 Weichert Jun 2004 A1
20040122754 Stevens Jun 2004 A1
20040133511 Smith et al. Jul 2004 A1
20040133516 Buchanan et al. Jul 2004 A1
20040136586 Okamura Jul 2004 A1
20040138974 Shimamura Jul 2004 A1
20040148235 Craig et al. Jul 2004 A1
20040158549 Matena Aug 2004 A1
20040165096 Maeno Aug 2004 A1
20040170259 Park Sep 2004 A1
20040171371 Paul Sep 2004 A1
20040184766 Kim et al. Sep 2004 A1
20040193878 Dillinger et al. Sep 2004 A1
20040201695 Inasaka Oct 2004 A1
20040201741 Ban Oct 2004 A1
20040202349 Erol et al. Oct 2004 A1
20040205459 Green Oct 2004 A1
20040210515 Hughes Oct 2004 A1
20040210523 Gains et al. Oct 2004 A1
20040217170 Takiguchi et al. Nov 2004 A1
20040225604 Foss, Jr. et al. Nov 2004 A1
20040228277 Williams Nov 2004 A1
20040236647 Acharya Nov 2004 A1
20040236688 Bozeman Nov 2004 A1
20040238619 Nagasaka et al. Dec 2004 A1
20040240722 Tsuji et al. Dec 2004 A1
20040245324 Chen Dec 2004 A1
20040247199 Murai et al. Dec 2004 A1
20040248600 Kim Dec 2004 A1
20040252679 Williams Dec 2004 A1
20040260636 Marceau Dec 2004 A1
20040267665 Nam et al. Dec 2004 A1
20040267666 Minami Dec 2004 A1
20050001421 Luth et al. Jan 2005 A1
20050001924 Honda Jan 2005 A1
20050010108 Rahn et al. Jan 2005 A1
20050015332 Chen Jan 2005 A1
20050015341 Jackson Jan 2005 A1
20050015342 Murata Jan 2005 A1
20050021466 Buchanan et al. Jan 2005 A1
20050030388 Stavely et al. Feb 2005 A1
20050033645 Duphily Feb 2005 A1
20050033685 Reyes Feb 2005 A1
20050033690 Antognini et al. Feb 2005 A1
20050033695 Minowa Feb 2005 A1
20050034046 Berkmann Feb 2005 A1
20050035193 Gustin et al. Feb 2005 A1
20050038746 Latimer et al. Feb 2005 A1
20050038754 Geist Feb 2005 A1
20050044042 Mendiola Feb 2005 A1
20050044577 Jerding Feb 2005 A1
20050049950 Johnson Mar 2005 A1
20050071283 Randle et al. Mar 2005 A1
20050075969 Nielson et al. Apr 2005 A1
20050075974 Turgeon Apr 2005 A1
20050077351 De Jong Apr 2005 A1
20050078192 Sakurai Apr 2005 A1
20050078336 Ferlitsch Apr 2005 A1
20050080725 Pick Apr 2005 A1
20050082364 Alvarez et al. Apr 2005 A1
20050086140 Ireland Apr 2005 A1
20050086168 Alvarez Apr 2005 A1
20050089209 Stefanuk Apr 2005 A1
20050091161 Gustin Apr 2005 A1
20050096992 Geisel May 2005 A1
20050097019 Jacobs May 2005 A1
20050097046 Singfield May 2005 A1
20050097050 Orcutt May 2005 A1
20050100216 Myers et al. May 2005 A1
20050102208 Gudgeon May 2005 A1
20050108164 Salafia May 2005 A1
20050108168 Halpin May 2005 A1
20050115110 Dinkins Jun 2005 A1
20050125338 Tidwell et al. Jun 2005 A1
20050125360 Tidwell et al. Jun 2005 A1
20050127160 Fujikawa Jun 2005 A1
20050128333 Park Jun 2005 A1
20050131820 Rodriguez Jun 2005 A1
20050133586 Rekeweg et al. Jun 2005 A1
20050143136 Lev et al. Jun 2005 A1
20050144131 Aziz Jun 2005 A1
20050149436 Elterich Jul 2005 A1
20050157174 Kitamura et al. Jul 2005 A1
20050165641 Chu Jul 2005 A1
20050168566 Tada Aug 2005 A1
20050171899 Dunn Aug 2005 A1
20050171907 Lewis Aug 2005 A1
20050177494 Kelly et al. Aug 2005 A1
20050177499 Thomas Aug 2005 A1
20050177510 Hilt et al. Aug 2005 A1
20050177518 Brown Aug 2005 A1
20050182710 Andersson et al. Aug 2005 A1
20050188306 Mackenzie Aug 2005 A1
20050190269 Grignani Sep 2005 A1
20050198364 David del Val et al. Sep 2005 A1
20050203430 Williams et al. Sep 2005 A1
20050205660 Munte Sep 2005 A1
20050205661 Taylor Sep 2005 A1
20050209961 Michelsen Sep 2005 A1
20050213805 Blake et al. Sep 2005 A1
20050216409 McMonagle et al. Sep 2005 A1
20050216410 Davis et al. Sep 2005 A1
20050218209 Heilper et al. Oct 2005 A1
20050220324 Klein et al. Oct 2005 A1
20050228733 Bent et al. Oct 2005 A1
20050238257 Kaneda et al. Oct 2005 A1
20050244035 Klein et al. Nov 2005 A1
20050252955 Sugai Nov 2005 A1
20050267843 Acharya et al. Dec 2005 A1
20050268107 Harris et al. Dec 2005 A1
20050269412 Chiu Dec 2005 A1
20050273368 Hutten et al. Dec 2005 A1
20050273430 Pliha Dec 2005 A1
20050278250 Zair Dec 2005 A1
20050281448 Lugg Dec 2005 A1
20050281450 Richardson Dec 2005 A1
20050281471 LeComte Dec 2005 A1
20050281474 Huang Dec 2005 A1
20050289030 Smith Dec 2005 A1
20050289059 Brewington et al. Dec 2005 A1
20050289182 Pandian et al. Dec 2005 A1
20060002426 Madour Jan 2006 A1
20060004660 Pranger Jan 2006 A1
20060015450 Guck et al. Jan 2006 A1
20060015733 O'Malley et al. Jan 2006 A1
20060017752 Kurzweil et al. Jan 2006 A1
20060025697 Kurzweil Feb 2006 A1
20060026140 King Feb 2006 A1
20060039628 Li et al. Feb 2006 A1
20060039629 Li et al. Feb 2006 A1
20060041506 Mason et al. Feb 2006 A1
20060171697 Nijima Feb 2006 A1
20060004537 Kim et al. Mar 2006 A1
20060045321 Yu Mar 2006 A1
20060045379 Heaney, Jr. et al. Mar 2006 A1
20060047593 Naratil Mar 2006 A1
20060049242 Mejias et al. Mar 2006 A1
20060053056 Alspach-Goss Mar 2006 A1
20060059085 Tucker Mar 2006 A1
20060064368 Forte Mar 2006 A1
20060071950 Kurzweil et al. Apr 2006 A1
20060077941 Alagappan et al. Apr 2006 A1
20060080245 Bahl Apr 2006 A1
20060085357 Pizarro Apr 2006 A1
20060085516 Farr et al. Apr 2006 A1
20060102704 Reynders May 2006 A1
20060103893 Azimi et al. May 2006 A1
20060106691 Sheaffer May 2006 A1
20060106717 Randle May 2006 A1
20060108168 Fischer et al. May 2006 A1
20060110063 Weiss May 2006 A1
20060112013 Maloney May 2006 A1
20060115110 Rodriguez Jun 2006 A1
20060115141 Koakutsu et al. Jun 2006 A1
20060118613 McMann Jun 2006 A1
20060124728 Kotovich et al. Jun 2006 A1
20060124730 Maloney Jun 2006 A1
20060144924 Stover Jul 2006 A1
20060144937 Heilper et al. Jul 2006 A1
20060144950 Johnson Jul 2006 A1
20060152576 Kiessling et al. Jul 2006 A1
20060159367 Zeineh et al. Jul 2006 A1
20060161499 Rich et al. Jul 2006 A1
20060161501 Waserstein Jul 2006 A1
20060164682 Lev Jul 2006 A1
20060166178 Driedijk Jul 2006 A1
20060167818 Wentker et al. Jul 2006 A1
20060181614 Yen et al. Aug 2006 A1
20060182331 Gilson et al. Aug 2006 A1
20060182332 Weber Aug 2006 A1
20060186194 Richardson et al. Aug 2006 A1
20060202014 VanKirk et al. Sep 2006 A1
20060206506 Fitzpatrick Sep 2006 A1
20060208059 Cable et al. Sep 2006 A1
20060210138 Hilton et al. Sep 2006 A1
20060210192 Orhun Sep 2006 A1
20060212391 Norman et al. Sep 2006 A1
20060212393 Brown Sep 2006 A1
20060214940 Kinoshita Sep 2006 A1
20060215204 Miyamoto et al. Sep 2006 A1
20060215230 Borrey et al. Sep 2006 A1
20060221198 Fry et al. Oct 2006 A1
20060221415 Kawamoto Oct 2006 A1
20060222260 Sambongi et al. Oct 2006 A1
20060229976 Jung Oct 2006 A1
20060229986 Corder Oct 2006 A1
20060229987 Leekley Oct 2006 A1
20060238503 Smith Oct 2006 A1
20060242062 Peterson Oct 2006 A1
20060242063 Peterson Oct 2006 A1
20060248009 Hicks et al. Nov 2006 A1
20060249567 Byrne Nov 2006 A1
20060255124 Hoch Nov 2006 A1
20060270421 Phillips Nov 2006 A1
20060273165 Swift et al. Dec 2006 A1
20060274164 Kimura et al. Dec 2006 A1
20060279628 Fleming Dec 2006 A1
20060282383 Doran Dec 2006 A1
20060289630 Updike et al. Dec 2006 A1
20060291744 Ikeda et al. Dec 2006 A1
20070002157 Shintani et al. Jan 2007 A1
20070005467 Haigh et al. Jan 2007 A1
20070013721 Vau et al. Jan 2007 A1
20070016796 Singhal Jan 2007 A1
20070019243 Sato Jan 2007 A1
20070022053 Waserstein Jan 2007 A1
20070027802 VanDeburg et al. Feb 2007 A1
20070030357 Levien et al. Feb 2007 A1
20070030363 Cheatle et al. Feb 2007 A1
20070031022 Frew Feb 2007 A1
20070038561 Vancini et al. Feb 2007 A1
20070041629 Prakash et al. Feb 2007 A1
20070050292 Yarbrough Mar 2007 A1
20070053574 Verma et al. Mar 2007 A1
20070058851 Quine Mar 2007 A1
20070058874 Sunao et al. Mar 2007 A1
20070063016 Myatt Mar 2007 A1
20070064991 Douglas et al. Mar 2007 A1
20070065143 Didow et al. Mar 2007 A1
20070075772 Kokubo Apr 2007 A1
20070076940 Goodall et al. Apr 2007 A1
20070076941 Carreon et al. Apr 2007 A1
20070077921 Hayashi Apr 2007 A1
20070080207 Williams Apr 2007 A1
20070082700 Landschaft Apr 2007 A1
20070084911 Crowell Apr 2007 A1
20070086642 Foth Apr 2007 A1
20070086643 Spier Apr 2007 A1
20070094088 Mastie Apr 2007 A1
20070094140 Riney et al. Apr 2007 A1
20070100748 Dheer May 2007 A1
20070110277 Hayduchok et al. May 2007 A1
20070116364 Kleihorst et al. May 2007 A1
20070118472 Allen-Rouman et al. May 2007 A1
20070118747 Pintsov et al. May 2007 A1
20070122024 Haas et al. May 2007 A1
20070124241 Newton May 2007 A1
20070127805 Foth et al. Jun 2007 A1
20070129955 Dalmia Jun 2007 A1
20070130063 Jindia Jun 2007 A1
20070131758 Mejias et al. Jun 2007 A1
20070136078 Plante Jun 2007 A1
20070136198 Foth et al. Jun 2007 A1
20070138255 Carreon et al. Jun 2007 A1
20070140545 Rossignoli Jun 2007 A1
20070140594 Franklin Jun 2007 A1
20070143208 Varga Jun 2007 A1
20070150337 Hawkins et al. Jun 2007 A1
20070154098 Geva et al. Jul 2007 A1
20070156438 Popadic et al. Jul 2007 A1
20070168265 Rosenberger Jul 2007 A1
20070168283 Alvarez et al. Jul 2007 A1
20070171288 Inoue Jul 2007 A1
20070172107 Jones et al. Jul 2007 A1
20070172148 Hawley Jul 2007 A1
20070175977 Bauer et al. Aug 2007 A1
20070179883 Questembert Aug 2007 A1
20070183000 Eisen et al. Aug 2007 A1
20070183652 Backstrom et al. Aug 2007 A1
20070183741 Lerman et al. Aug 2007 A1
20070194102 Cohen Aug 2007 A1
20070198432 Pitroda et al. Aug 2007 A1
20070203708 Polyon et al. Aug 2007 A1
20070206877 Wu et al. Sep 2007 A1
20070208816 Baldwin et al. Sep 2007 A1
20070214086 Homoki Sep 2007 A1
20070217669 Swift et al. Sep 2007 A1
20070233525 Boyle Oct 2007 A1
20070233585 Simon et al. Oct 2007 A1
20070235518 Mueller et al. Oct 2007 A1
20070235520 Smith et al. Oct 2007 A1
20070241179 Davis Oct 2007 A1
20070244782 Chimento Oct 2007 A1
20070244811 Tumminaro Oct 2007 A1
20070246525 Smith et al. Oct 2007 A1
20070251992 Sharma et al. Nov 2007 A1
20070255652 Tumminaro Nov 2007 A1
20070255653 Tumminaro Nov 2007 A1
20070255662 Tumminaro Nov 2007 A1
20070258634 Simonoff Nov 2007 A1
20070262137 Brown Nov 2007 A1
20070262148 Yoon Nov 2007 A1
20070268540 Gaspardo et al. Nov 2007 A1
20070271182 Prakash et al. Nov 2007 A1
20070278286 Crowell et al. Dec 2007 A1
20070288380 Starrs Dec 2007 A1
20070288382 Narayanan et al. Dec 2007 A1
20070295803 Levine et al. Dec 2007 A1
20070299928 Kohli et al. Dec 2007 A1
20080002911 Eisen Jan 2008 A1
20080010204 Rackley III et al. Jan 2008 A1
20080013831 Aoki Jan 2008 A1
20080021802 Pendelton Jan 2008 A1
20080040280 Davis et al. Feb 2008 A1
20080046362 Easterly Feb 2008 A1
20080052182 Marshall Feb 2008 A1
20080059376 Davis Mar 2008 A1
20080062437 Rizzo Mar 2008 A1
20080063253 Wood Mar 2008 A1
20080065524 Matthews et al. Mar 2008 A1
20080068674 McIntyre Mar 2008 A1
20080069427 Liu Mar 2008 A1
20080071679 Foley Mar 2008 A1
20080071721 Wang Mar 2008 A1
20080073423 Heit et al. Mar 2008 A1
20080080760 Ronca Apr 2008 A1
20080086420 Gilder et al. Apr 2008 A1
20080086421 Gilder Apr 2008 A1
20080086770 Kulkarni et al. Apr 2008 A1
20080091599 Foss, Jr. Apr 2008 A1
20080097899 Jackson et al. Apr 2008 A1
20080097907 Till et al. Apr 2008 A1
20080103790 Abernethy May 2008 A1
20080103967 Ackert et al. May 2008 A1
20080113674 Baig May 2008 A1
20080114739 Hayes May 2008 A1
20080115066 Pavley et al. May 2008 A1
20080116257 Fickling May 2008 A1
20080117991 Peddireddy May 2008 A1
20080119178 Peddireddy May 2008 A1
20080133411 Jones et al. Jun 2008 A1
20080140552 Blaikie Jun 2008 A1
20080140579 Sanjiv Jun 2008 A1
20080147549 Rathbun et al. Jun 2008 A1
20080155672 v Jun 2008 A1
20080156438 Stumphauzer et al. Jul 2008 A1
20080162319 Breeden et al. Jul 2008 A1
20080162320 Mueller et al. Jul 2008 A1
20080162350 Allen-Rouman et al. Jul 2008 A1
20080162371 Rampell et al. Jul 2008 A1
20080177659 Lacey et al. Jul 2008 A1
20080180750 Feldman Jul 2008 A1
20080192129 Walker Aug 2008 A1
20080205751 Mischler Aug 2008 A1
20080208727 McLaughlin et al. Aug 2008 A1
20080214180 Cunningham et al. Sep 2008 A1
20080219543 Csulits Sep 2008 A1
20080245869 Berkun et al. Oct 2008 A1
20080247629 Gilder Oct 2008 A1
20080247655 Yano Oct 2008 A1
20080249931 Gilder et al. Oct 2008 A1
20080249951 Gilder et al. Oct 2008 A1
20080250196 Mori Oct 2008 A1
20080262950 Christensen et al. Oct 2008 A1
20080262953 Anderson Oct 2008 A1
20080275821 Bishop et al. Nov 2008 A1
20080301441 Calman et al. Dec 2008 A1
20080304769 Hollander et al. Dec 2008 A1
20080316542 Mindrum et al. Dec 2008 A1
20090024520 Drory et al. Jan 2009 A1
20090046938 Yoder Feb 2009 A1
20090060396 Blessan et al. Mar 2009 A1
20090066987 Inokuchi Mar 2009 A1
20090076921 Nelson et al. Mar 2009 A1
20090092309 Calman et al. Apr 2009 A1
20090094148 Gilder et al. Apr 2009 A1
20090108080 Meyer Apr 2009 A1
20090110281 Hirabayashi Apr 2009 A1
20090114716 Ramachandran May 2009 A1
20090132813 Schibuk May 2009 A1
20090141962 Borgia et al. Jun 2009 A1
20090164350 Sorbe et al. Jun 2009 A1
20090164370 Sorbe et al. Jun 2009 A1
20090166406 Pigg et al. Jul 2009 A1
20090167870 Caleca et al. Jul 2009 A1
20090171723 Jenkins Jul 2009 A1
20090171795 Clouthier et al. Jul 2009 A1
20090171819 Emde et al. Jul 2009 A1
20090171825 Roman Jul 2009 A1
20090173781 Ramachandran Jul 2009 A1
20090176511 Morrison Jul 2009 A1
20090185241 Nepomniachtchi Jul 2009 A1
20090185737 Nepomniachtchi Jul 2009 A1
20090185738 Nepomniachtchi Jul 2009 A1
20090190823 Walters Jul 2009 A1
20090192938 Amos Jul 2009 A1
20090212929 Drory et al. Aug 2009 A1
20090222347 Whitten Sep 2009 A1
20090236413 Mueller et al. Sep 2009 A1
20090240574 Carpenter Sep 2009 A1
20090240620 Kendrick et al. Sep 2009 A1
20090252437 Li Oct 2009 A1
20090254447 Blades Oct 2009 A1
20090257641 Liu et al. Oct 2009 A1
20090263019 Tzadok et al. Oct 2009 A1
20090271287 Halpern Oct 2009 A1
20090281904 Pharris Nov 2009 A1
20090284637 Parulski et al. Nov 2009 A1
20090290751 Ferman et al. Nov 2009 A1
20090292628 Dryer et al. Nov 2009 A1
20090313167 Dujari et al. Dec 2009 A1
20090319425 Tumminaro et al. Dec 2009 A1
20090327129 Collas et al. Dec 2009 A1
20100007899 Lay Jan 2010 A1
20100008579 Smirnov Jan 2010 A1
20100016016 Brundage et al. Jan 2010 A1
20100027679 Sunahara et al. Feb 2010 A1
20100030687 Panthaki et al. Feb 2010 A1
20100038839 DeWitt et al. Feb 2010 A1
20100047000 Park et al. Feb 2010 A1
20100057578 Blair et al. Mar 2010 A1
20100061446 Hands et al. Mar 2010 A1
20100069093 Morrison Mar 2010 A1
20100069155 Schwartz Mar 2010 A1
20100076890 Low Mar 2010 A1
20100078471 Lin et al. Apr 2010 A1
20100078472 Lin et al. Apr 2010 A1
20100082468 Low et al. Apr 2010 A1
20100082470 Walach Apr 2010 A1
20100112975 Sennett May 2010 A1
20100128131 Tenchio et al. May 2010 A1
20100150424 Nepomniachtchi et al. Jun 2010 A1
20100161408 Karson Jun 2010 A1
20100165015 Barkley et al. Jul 2010 A1
20100198733 Gantman et al. Aug 2010 A1
20100201711 Fillion et al. Aug 2010 A1
20100225773 Lee Sep 2010 A1
20100226559 Najari et al. Sep 2010 A1
20100260408 Prakash et al. Oct 2010 A1
20100262522 Anderson et al. Oct 2010 A1
20100262607 Vassilvitskii Oct 2010 A1
20100274693 Bause et al. Oct 2010 A1
20100287250 Carlson Nov 2010 A1
20100312705 Caruso et al. Dec 2010 A1
20110015963 Chafle Jan 2011 A1
20110016084 Mundy et al. Jan 2011 A1
20110016109 Vassilvitskii Jan 2011 A1
20110054780 Dhanani Mar 2011 A1
20110069180 Nijemcevic et al. Mar 2011 A1
20110082747 Khan Apr 2011 A1
20110083101 Sharon Apr 2011 A1
20110105092 Felt May 2011 A1
20110106675 Perlman May 2011 A1
20110112967 Anderson et al. May 2011 A1
20110112985 Kocmond May 2011 A1
20110170740 Coleman Jul 2011 A1
20110191161 Dai Aug 2011 A1
20110251956 Cantley et al. Oct 2011 A1
20110276483 Saegert et al. Nov 2011 A1
20110280450 Nepomniachtchi et al. Nov 2011 A1
20110285874 Showering et al. Nov 2011 A1
20110310442 Popadic et al. Dec 2011 A1
20120036014 Sunkada Feb 2012 A1
20120045112 Lundblad et al. Feb 2012 A1
20120047070 Pharris Feb 2012 A1
20120052874 Kumar Mar 2012 A1
20120062732 Marman et al. Mar 2012 A1
20120089514 Kraemling et al. Apr 2012 A1
20120098705 Yost Apr 2012 A1
20120099792 Chevion et al. Apr 2012 A1
20120109793 Abeles May 2012 A1
20120113489 Heit et al. May 2012 A1
20120150767 Chacko Jun 2012 A1
20120185388 Pranger Jul 2012 A1
20120185393 Atsmon et al. Jul 2012 A1
20120229872 Dolev Sep 2012 A1
20120230577 Calman et al. Sep 2012 A1
20120296768 Fremont-Smith Nov 2012 A1
20130021651 Popadic et al. Jan 2013 A9
20130120595 Roach et al. May 2013 A1
20130155474 Roach et al. Jun 2013 A1
20130159183 Lopez Jun 2013 A1
20130191261 Chandler Jul 2013 A1
20130198071 Jurss Aug 2013 A1
20130201534 Carlen Aug 2013 A1
20130223721 Nepomniachtchi et al. Aug 2013 A1
20130297353 Strange et al. Nov 2013 A1
20130324160 Sabatelli Dec 2013 A1
20130332004 Gompert et al. Dec 2013 A1
20130332219 Clark Dec 2013 A1
20130346306 Kopp Dec 2013 A1
20130346307 Kopp Dec 2013 A1
20140010467 Mochizuki et al. Jan 2014 A1
20140032406 Roach et al. Jan 2014 A1
20140037183 Gorski et al. Feb 2014 A1
20140067661 Elischer Mar 2014 A1
20140156501 Howe Jun 2014 A1
20140197922 Stanwood et al. Jul 2014 A1
20140203508 Pedde Jul 2014 A1
20140207673 Jeffries Jul 2014 A1
20140207674 Schroeder Jul 2014 A1
20140236820 Carlton et al. Aug 2014 A1
20140244476 Shvarts Aug 2014 A1
20140258169 Wong et al. Sep 2014 A1
20140279453 Belchee et al. Sep 2014 A1
20140313335 Koravadi Oct 2014 A1
20140351137 Chisholm Nov 2014 A1
20140374486 Collins, Jr. Dec 2014 A1
20150039528 Minogue et al. Feb 2015 A1
20150090782 Dent et al. Apr 2015 A1
20150134517 Cosgray May 2015 A1
20150235484 Kraeling et al. Aug 2015 A1
20150244994 Jang et al. Aug 2015 A1
20150294523 Smith Oct 2015 A1
20150348591 Kaps et al. Dec 2015 A1
20160026866 Sundaresan Jan 2016 A1
20160034590 Endras et al. Feb 2016 A1
20160142625 Weksler et al. May 2016 A1
20160189500 Kim et al. Jun 2016 A1
20160307388 Williams Oct 2016 A1
20160335816 Thoppae et al. Nov 2016 A1
20170039637 Wandelmer Feb 2017 A1
20170068421 Carlson Mar 2017 A1
20170132583 Nair May 2017 A1
20170146602 Samp et al. May 2017 A1
20170229149 Rothschild et al. Aug 2017 A1
20170263120 Durie, Jr. et al. Sep 2017 A1
20170337610 Beguesse Nov 2017 A1
20180025251 Welinder Jan 2018 A1
20180108252 Pividori Apr 2018 A1
20180197118 McLaughlin Jul 2018 A1
20190026577 Hall et al. Jan 2019 A1
20190122222 Uechi Apr 2019 A1
20190311227 Kriegman Oct 2019 A1
20200311844 Luo Oct 2020 A1
20210097615 Gunn, Jr. Apr 2021 A1
Foreign Referenced Citations (19)
Number Date Country
2619884 Mar 2007 CA
1897644 Jan 2007 CN
1967565 May 2007 CN
0 984 410 Mar 2000 EP
1 855 459 Nov 2007 EP
2004-23158 Jan 2004 JP
3708807 Oct 2005 JP
2006-174105 Jun 2006 JP
20040076131 Aug 2004 KR
WO 9614707 May 1996 WO
WO 9837655 Aug 1998 WO
WO 0161436 Aug 2001 WO
WO 2004008350 Jan 2004 WO
WO 2005043857 May 2005 WO
WO 2005124657 Dec 2005 WO
WO 2006075967 Jul 2006 WO
WO 2006086768 Aug 2006 WO
WO 2006136958 Dec 2006 WO
WO 2007024889 Mar 2007 WO
Non-Patent Literature Citations (494)
“Accept “Customer Not Present” Checks,” Accept Check Online, http://checksoftware.com, cited in U.S. Pat. No. 7,900,822, as dated 2007 (1 pg).
“Adjusting Brightness and Contrast”, www.eaglesoftware.com/adjustin.htm, retrieved on May 4, 2009 (4 pgs).
“Best practices for producing quality digital image files,” Digital Images Guidelines, http://deepblue.lib.umich.edu/bitstream/2027.42/40247/1/Images-Best_Practice.pdf, downloaded 2007 (2 pgs).
“Chapter 7 Payroll Programs,” Uniform Staff Payroll System, http://www2.oecn.k12.oh.us/www/ssdt/usps/usps_user_guide_005.html, cited in U.S. Pat. No. 7,900,822, as dated 2007 (9 pgs).
“Check 21—The check is not in the post”, RedTitan Technology 2004 http://www.redtitan.com/check21.htm (3 pgs).
“Check 21 Solutions,” Columbia Financial International, Inc. http://www.columbiafinancial.us/check21/solutions.htm, cited in U.S. Pat. No. 7,900,822, as dated 2007 (8 pgs).
“Check Fraud: A Guide to Avoiding Losses”, All Net, http://all.net/books/audit/checkfraud/security.htm, cited in U.S. Pat. No. 7,900,822, as dated 2007 (1 pg).
“Clearing House Electronic Check Clearing System (CHECCS) Operating Rules,” An IP.com Prior Art Database Technical Disclosure, Jul. 29, 2015 (35 pgs).
“Compliance with Regulation CC”, http://www.federalreserve.gov/Pubs/regcc/regcc.htm, Jan. 24, 2006 (6 pgs).
“Custom Personalized Bank Checks and Address Labels” Checks Your Way Inc., http://www.checksyourway.com/htm/web_pages/faq.htm, cited in U.S. Pat. No. 7,900,822, as dated 2007 (6 pgs).
“Deposit Now: Quick Start User Guide,” BankServ, 2007, 29 pages.
“Direct Deposit Application for Payroll”, Purdue University, Business Office Form 0003, http://purdue.edu/payroll/pdf/dierctdepositapplication.pdf, Jul. 2007 (2 pgs).
“Direct Deposit Authorization Form”, www.umass.edu/humres/library/DDForm.pdf, May 2003 (3 pgs).
“Direct Deposit,” University of Washington, http://www.washington.edu/admin/payroll/directdeposit.html, cited in U.S. Pat. No. 7,900,822, as dated 2007 (3 pgs).
“Electronic Billing Problem: The E-check is in the mail” American Banker—vol. 168, No. 95, May 19, 2003 (4 pgs).
“First Wireless Handheld Check and Credit Card Processing Solution Launched by Commerciant®, MobileScape® 5000 Eliminates Bounced Checks, Enables Payments Everywhere,” Business Wire, Mar. 13, 2016, 3 pages.
“Frequently Asked Questions” Bank of America, http://www.bankofamerica.com/deposits/checksave/index.cfm?template-lc_faq_bymail, cited in U.S. Pat. No. 7,900,822, as dated 2007 (2 pgs).
“Full Service Direct Deposit”, www.nonprofitstaffing.com/images/upload/ditdepform.pdf, cited in U.S. Pat. No. 7,900,822, as dated 2001 (2 pgs).
“How to Digitally Deposit a Check Image”, Smart Money Daily, Copyright 2008 (5 pgs).
“ImageNet Mobile Deposit Provides Convenient Check Deposit and Bill Pay to Mobile Consumers,” Mitek Systems, 2008 (2 pgs).
“It's the easiest way to Switch banks”, LNB, http://www.inkby.com/pdf/LNBswitch-kit10-07.pdf cited in U.S. Pat. No. 7,996,316, as dated 2007 (7 pgs).
“Lesson 38—More Bank Transactions”, Turtle Soft, http://www.turtlesoft.com/goldenseal-software-manual.lesson38.htm, cited in U.S. Pat. No. 7,900,822, as dated 2007 (8 pgs).
“Middleware”, David E. Bakken, Encyclopedia of Distributed Computing, Kluwer Academic Press, 2001 (6 pgs).
“Mitek Systems Announces Mobile Deposit Application for Apple iPhone,” http://prnewswire.com/cgi-bin/stories.pl?ACCT=104&STORY=/www/story/10-01- . . . , Nov. 25, 2008 (2 pgs).
“NOVA Enhances Electronic Check Service to Benefit Multi-Lane Retailers,” Business Wire, Nov. 28, 2006, 2 pages.
“Personal Finance”, PNC, http://www.pnc.com/webapp/unsec/productsandservice.do?sitearea=/PNC/home/personal/account+services/quick+switch/quick+switch+faqs, cited in U.S. Pat. No. 7,900,822, as dated 2007 (12 pgs).
“Quicken Bill Pay”, Retrieved from the Internet on Nov. 27, 2007 at: <URL:http://quicken.intuit.com/quicken-bill-pay.jhtml>, 2 pgs.
“Refractive Index” Wikipedia, the free encyclopedia; http://en.wikipedia.org/wiki/refractiveindex.com Oct. 16, 2007 (4 pgs).
“Remote check deposit is the answer to a company's banking problem,” Daily Breeze, Torrance, CA, Nov. 17, 2006, 2 pgs.
“Remote Deposit Capture”, Plante & Moran, http://plantemoran.com/industries/fincial/institutions/bank/resources/community+bank+advisor/2007+summer+issue/remote+deposit+capture.htm, cited in U.S. Pat. No. 7,900,822, as dated 2007 (3 pgs).
“Remote Deposit” National City, http://www.nationalcity.com/smallbusiness/cashmanagement/remotedeposit/default.asp; cited in U.S. Pat. No. 7,900,822, as dated 2007 (1 pg).
“Save on ATM Fees”, RedEye Edition, Chicago Tribune, Chicago, IL Jun. 30, 2007 (2 pgs).
“SNB Check Capture: SmartClient User's Guide,” Nov. 2006, 21 pgs.
“Start to Simplify with Check Imaging a Smarter Way to Bank”, Retrieved from the Internet on Nov. 27, 2007, at <URL:http://www.midnatbank.com/Internet%20Banking/Internet_Banking.html>, 3 pgs.
“Switching Made Easy,” Bank of North Georgia, http://www.banknorthgeorgia.com/cmsmaster/documents/286/documents616.pdf, 2007 (7 pgs).
“Two Words Every Business Should Know: Remote Deposit,” Canon, http://www.rpsolutions.com/rpweb/pdfs/canon_rdc.pdf, 2005 (7 pgs).
“Virtual Bank Checks”, Morebusiness.com, http://www.morebusiness.com/running_yourbusiness/businessbits/d908484987.brc, cited in U.S. Pat. No. 7,900,822, as dated 2007 (3 pgs).
“WallStreetGrapevine.com” Stocks on the Rise: JADG, BKYI, MITK; Mar. 3, 2008 (4 pgs).
“What is check Fraud”, National Check Fraud Center, http://www.ckfraud.org/ckfraud.html, cited in U.S. Pat. No. 7,900,822, as dated 2007 (12 pgs).
“Exchangeable image file format for digital still cameras: Exif Version 2.2,” Standard of Electronics and Information Technology Industries Association, JEITA CP-3451, Technical Standardization Committee on AV & IT Storage Systems and Equipment, Japan Electronics and Information Technology Industries Association, Apr. 2002 (154 pgs). (retrieved from: http://www.exif.org/Exif2-2.PDF).
“Getting Started with ICLs aka X9.37 Files”, All My Papers, May 2, 2006, 39 pgs.
“Machine Accepts Bank Deposits”, New York Times, Apr. 12, 1961, 1 pg.
“Sprint PCS Vision Guide”, 2005, 86 pgs.
“Vodafone calls on moviles to go live!”, 2002, 8 pgs.
12 CFR § 229.51 and Appendix D to Part 229 (Jan. 1, 2005 edition), 3 pgs.
149 Cong. Rec. H9289, Oct. 8, 2003, 6 pgs.
ABA Routing System Transit Number, Wikipedia, dated Sep. 27, 2006, 3 pgs.
About Network Servers, GlobalSpec (retrieved from https://web.archive.org/web/20051019130842/http://globalspec.com:80/LearnMore/Networking_Communication_Equipment/Networking_Equipment/Network_Servers_) (“GlobalSpec”).
Accredited Standards Committee Technical Report TR 33-2006, dated Aug. 28, 2006, 75 pgs.
Affinity Federal Credit Union, “Affinity Announces Online Deposit,” Aug. 4, 2005 (1 pg).
Albrecht, W. Steve, “Check Kiting: Detection, Prosecution and Prevention,” The FBI Law Enforcement Bulletin, Nov. 1, 1993 (6 pgs).
Alves, Vander and Borba, Paulo; “Distributed Adapters Pattern: A Design for Object-Oriented Distributed Applications”; First Latin American Conference on Pattern Languages of Programming; Oct. 2001; pp. 132-142; Rio de Janeiro, Brazil (11 pgs).
Amber Avalona-Butler/Paraglide, “At Your Service: Best iPhone Apps for Military Lifestyle,” Jul. 9, 2010 (2 pgs).
Anderson, Milton M. “FSML and Echeck”, Financial Services Technology Consortium, 1999 (17 pgs).
Andrew S. Tanenbaum, Modern Operating Systems, Second Edition (2001).
ANS X9.100-140-2004, “Specification for an Image Replacement document—IRD”, American Standard for Financial Services, Oct. 1, 2004, 15 pgs.
ANSI News, Check 21 Goes Into Effect Oct. 28, 2004, dated Oct. 25, 2004, 1 pg.
ANSI X9.7-1999(R2007), Bank Check Background and Convenience Amount Field Specification, dated Jul. 11, 2007, 86 pgs.
ANSI, “Return Reasons for Check Image Exchange of IRDS”, dated May 6, 2016, 23 pgs.
ANSI, “Specifications For Electronic Exchange of Check and Image Data”, dated Jul. 11, 2006, 230 pgs.
Apple Announces the New iPhone 3GS—The Fastest, Most Powerful iPhone Yet, Jun. 8, 2009, located on the Internet at: http://www.apple.com/newsroom/2009/06/08Apple-Announces-the-New-iPhone-3GS-The-Fastest-Most-Powerful-iPhone-Yet, 4 pgs.
Apple Reinvents the Phone with iPhone, Jan. 2007, located on the Internet at: https://www.apple.com/newsroom/2007/01/09Apple-Reinvents-the-Phone-with-iPhone/, 4 pgs.
Aradhye, Hrishikesh B., “A Generic Method for Determining Up/Down Orientation of Text in Roman and Non-Roman Scripts,” Pattern Recognition Society, Dec. 13, 2014, 18 pages.
Archive Index Systems; Panini My Vision X-30 or VX30 or X30 © 1994-2008 Archive Systems, Inc. P.O. Box 40135 Bellevue, WA USA 98015 (2 pgs).
Arnold et al, The Java Programming Language, Fourth Edition (2005).
ASCX9, “Specification for Electronic Exchange of Check and Image Data”, dated Mar. 31, 2003, 156 pgs.
Askey, Canon EOS 40D Review (pts. 1,4,10), Digital Photography Review, located on the Internet at: https://www.dpreview.com/reviews/canoneos40d, 24 pgs.
Askey, Leica Digilux 2 Review (pts. 1,3,7), Digital Photography Review, May 20, 2004, located on the Internet at: https://www.dpreview.com/reviews/leicadigilux2, 20 pgs.
Askey, Nikon D300 In-depth Review (pts. 1,3,9), Digital Photography Review, Mar. 12, 2008, located on the Internet at: https://www.dpreview.com/reviews/nikond300, 24 pgs.
Askey, Panasonic Lumix DMC-L1 Review (pts. 1,3,7), Digital Photography Review, Apr. 11, 2007, located on the Internet at: https://www.dpreview.com/reviews/panasonicdmc11, 24 pgs.
Askey, Sony Cyber-shot DSC-R1 Review (pts. 1,3,7), Digital Photography Review, Dec. 6, 2005, located on the Internet at: http://www.dpreview.com/reviews/sonydscr1, 24 pgs.
Association of German Banks, SEPA 2008: Uniform Payment Instruments for Europe, Berlin, cited in U.S. Pat. No. 7,900,822, as dated Jul. 2007, Bundesverband deutscher Banken e.V. (42 pgs).
Automated Clearing Houses (ACHs), Federal Reserve Bank of New York (May 2000) available at: https://www.newyorkfed.org/aboutthefed/fedpoint/fed31.html, (attached as Exhibit 12 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 4 pgs.
Automated Merchant Systems, Inc., “Electronic Check Conversion,” http://www.automatedmerchant.com/electronic_check_conversion.cfm, 2006, downloaded Oct. 18, 2006 (3 pgs).
Bank Systems & Technology, Untitled Article, May 1, 2006, http://www.banktech.com/showarticle.jhtml?articleID=187003126, “Are you Winning in the Payment World?” (4 pgs).
Bankers' Online, “Training Page: Learning the Bank Numbering System”, Copyright 2004, 2 pgs.
BankServ, “DepositNow: What's the difference?” cited in U.S. Pat. No. 7,970,677, as dated 2006, (4 pgs).
BankServ, Product Overview, http://www.bankserv.com/products/remotedeposit.htm, cited in U.S. Pat. No. 7,970,677, as dated 2006, (3 pgs).
Berman, How Hitchcock Turned a Small Budget Into a Great Triumph, Time.com, Apr. 29, 2015, located on the Internet at: http://time.com/3823112/alfred-hitchcock-shadow-of-a-doubt, 1 pg.
Big Red Book, Adobe Systems Incorporated, copyright 2000, (attached as Exhibit 27 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 45 pgs.
Bills, Steve, “Automated Amount Scanning Is Trend in Remote-Deposit,” American Banker, New York, NY, Aug. 30, 2005, (3 pgs).
Blafore, Bonnie “Lower Commissions, Fewer Amenities”, Better Investing Madison Heights: Feb. 2003, vol. 52, Iss. 6, (4 pgs).
BLM Technologies, “Case Study: Addressing Check 21 and RDC Error and Fraud Threats,” Remote Deposit Capture News Articles from Jun. 11, 2007, Retrieved from http://www.remotedepositcapture.com/News/june_11_2007.htm on Feb. 19, 2008 (5 pgs).
Blue Mountain Consulting, from URL: www.bluemontainconsulting.com, cited in U.S. Pat. No. 7,900,822, as dated Apr. 26, 2006 (3 pgs).
Board of Governors of the Federal Reserve System, “Report to the Congress on the Check Clearing for the 21st Century Act of 2003,” Apr. 2007, Submitted to Congress pursuant to section 16 of the Check Clearing for the 21st Century Act of 2003 (59 pgs).
BrainJar Validation Algorithms, archived on Mar. 16, 2016 from BrainJar.com, 2 pgs.
Braun, Tim, “Camdesk—Towards Portable and Easy Document Capture,” Image Understanding and Pattern Recognition Research Group, Department of Computer Science, University of Kaiserslautern, Technical Report, Mar. 29, 2005 (64 pgs). (Retrieved from: https://pdfs.semanticscholar.org/93b2/ea0d12f24c91f3c46fa1c0d58a76bb132bd2.pdf).
Bruene, Jim; “Check Free to Enable In-Home Remote Check Deposit for Consumers and Small Business”, NetBanker.com, Financial Insite, Inc., http://www.netbanker.com/2008/02/checkfree_to_enableinhome_rem.html, Feb. 5, 2008 (3 pgs).
Bruene, Jim; “Digital Federal Credit Union and Four Others Offer Consumer Remote Deposit Capture Through EasCorp”, NetBanker—Tracking Online Finance, www.netbanker.com/2008/04/digital_federal_credit_union_a.html, Apr. 13, 2008 (3 pgs).
Bruno, M., “Instant Messaging,” Bank Technology News, Dec. 2002 (3 pgs).
Burnett, J. “Depository Bank Endorsement Requirements,” BankersOnline.com, http://www.bankersonline.com/cgi-bin/printview/printview.pl, Jan. 6, 2003 (3 pgs).
Callahan, J., “The first camera phone was sold 20 years ago, and it's not what you expect”, Android Authority, 2019, 5 pgs.
Canon EOS 40D Digital Camera Instruction Manual, located on the Internet at: http://gdlp01.c-wss.com/gds/6/0900008236/01/EOS40D_HG_EN.pdf (attached as Exhibit 6 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 38 pgs.
Canon White Paper, “Two Words Every Business Should Know—Remote Deposit”, dated 2005, 7 pgs.
Canon, ImageFormula CR-25/CR-55, “Improve Your Bottom Line with Front-Line Efficiencies”, 0117W117, 1207-55/25-1 OM-BSP, cited in U.S. Pat. No. 7,949,587 as dated 2007. (4 pgs).
Caplan, J. et al., “Most Influential Gadgets and Gizmos 2002: Sanyo SCP-5300”, 2002, 1 pg.
Carrubba, P. et al., “Remote Deposit Capture: A White Paper Addressing Regulatory, Operational and Risk Issues,” NetDeposit, Inc., 2006 (11 pgs).
CBM 2019-00027, Declaration of Bharat Prasad, dated Jul. 8, 2019, 32 pgs.
CBM 2019-00027, Patent Owner Preliminary Response and Exhibits 2001-1042, dated Jul. 8, 2019, 91 pgs.
CBM 2019-00028, “Motorola, Palm collaborate on smart phone”, Copyright 2000 by Crain Communications, Inc., 1 pg.
CBM 2019-00028, Aspire 9800 Series User Guide, Copyright 2006 by Acer International, 122 pgs.
CBM 2019-00028, Dell XPS M1210 Owner's Manual Copyright 2006 by Dell Inc., 192 pgs.
CBM 2019-00028, Estridge, Bonnie “Is your phone smart enough?: The series that cuts through the technobabble to bring you the best advice on the latest gadgets”, Copyright 2006 by XPRESS—Al Nisr Media, 3 pgs.
CBM 2019-00028, Lawler, Ryan “Apple shows Intel-based Macs, surge revenue”, Copyright 2006 by The Yomiuri Shimbun, 2 pgs.
CBM 2019-00028, Nasaw, Daniel “Viruses Pose Threat to “Smart” Cellphones—Computer Programs Could Cripple Devices and Shut Down Wireless Networks”, Copyright 2004 by Factiva, 2 pgs.
CBM 2019-00028, Seitz, Patrick “Multifunction Trend Shaking Up The Handheld Device Industry; Solid Sales Expected in 2004; PDA, handset, camera—one single, small product can fill a variety of roles”, Copyright 2004 Investor's Business Daily, Inc., 3 pgs.
CBM 2019-00028, United Services Automobile Association (“USAA”)'s Patent Owner Preliminary Response, dated Jul. 8, 2019, 73 pgs.
CBM2019-00002 U.S. Pat. No. 9,818,090, Decision Dismissing Petition and Terminating Proceeding 37 C.F.R. § 42.5(a) and 42.71(a), dated Apr. 26, 2019, 5 pgs.
CBM2019-00002 U.S. Pat. No. 9,818,090, Declaration of Matthew Calman In Support of Patent Owner Preliminary Response, dated Feb. 20, 2019, 14 pgs.
CBM2019-00002 U.S. Pat. No. 9,818,090, Declaration of Tim Crews In Support of Patent Owner Preliminary Response, dated Feb. 20, 2019, 8 pgs.
CBM2019-00002 U.S. Pat. No. 9,818,090, Katie Knight Videotape Deposition Transcript, dated Feb. 8, 2019, 27 pgs.
CBM2019-00002 U.S. Pat. No. 9,818,090, Peter Alexander, Ph.D., Oral and Videotaped Deposition, dated Jan. 23, 2019, 27 pgs.
CBM2019-00002 U.S. Pat. No. 9,818,090, United Services Automobile Association (USAA)'s Patent Owner Preliminary Response, dated Feb. 20, 2019, 75 pgs.
CBM2019-00002 U.S. Pat. No. 9,818,090, United Services Automobile Association (USAA)'s Updated Exhibit List, dated Mar. 19, 2019, 2 pgs.
CBM2019-00003 U.S. Pat. No. 8,699,779, Declaration of Matthew Calman in Support of Patent Owner Preliminary Response, dated Mar. 4, 2019, 15 pgs.
CBM2019-00003 U.S. Pat. No. 9,336,517, Decision Denying Institution of Covered Business Method Patent Review 37 C.F.R. § 42.208, dated Jun. 3, 2019, 28 pgs.
CBM2019-00003 U.S. Pat. No. 9,336,517, Katie Knight Videotape Deposition Transcript, dated Feb. 8, 2019, 27 pgs.
CBM2019-00003 U.S. Pat. No. 9,336,517, Peter Alexander, Ph.D., Oral and Videotaped Deposition, dated Jan. 23, 2019, 27 pgs.
CBM2019-00003 U.S. Pat. No. 9,336,517, Petitioner's Reply Brief to Patent Owner Preliminary Response Pursuant to Authorization Provided in Paper No. 14, dated Apr. 10, 2019, 10 pgs.
CBM2019-00003 U.S. Pat. No. 9,336,517, Petitioner's Updated Exhibits List, dated May 14, 2019, 7 pgs.
CBM2019-00003 U.S. Pat. No. 9,336,517, United Services Automobile Association (USAA)'s Patent Owner Preliminary Response, dated Mar. 4, 2019, 91 pgs.
CBM2019-00003 U.S. Pat. No. 9,336,517, United Services Automobile Association (USAA)'s Updated Exhibit List Pursuant to 37 CFR 42.63(e), dated Mar. 19, 2019, 8 pgs.
CBM2019-00003 U.S. Pat. No. 8,977,571, Decision Denying Institution of Covered Business Method Patent Review 37 C.F.R. § 42.208, dated May 15, 2019, 33 pgs.
CBM2019-00004 U.S. Pat. No. 8,977,571, Declaration of Matthew Calman In Support of Patent Owner Preliminary Response, dated Feb. 20, 2019, 14 pgs.
CBM2019-00004 U.S. Pat. No. 8,977,571, Declaration of Tim Crews In Support of Patent Owner Preliminary Response, dated Feb. 20, 2019, 8 pgs.
CBM2019-00004 U.S. Pat. No. 8,977,571, Defendant's Claim Construction Brief and supporting exhibits, United Services Automobile Association v. Wells Fargo Bank, N.A., Case No. 2:18-cv-245, dated Apr. 25, 2019, 36 pgs.
CBM2019-00004 U.S. Pat. No. 8,977,571, Patent Owner's Sur-Reply Brief to Petitioner's Reply Brief to Patent Owner Preliminary Response Pursuant to Authorization Provided in Paper 14, dated Apr. 30, 2019, 7 pgs.
CBM2019-00004 U.S. Pat. No. 8,977,571, United Services Automobile Association (USAA)'s Patent Owner Preliminary Response, dated Feb. 20, 2019, 99 pgs.
CBM2019-00004 U.S. Pat. No. 8,977,571, United Services Automobile Association (USAA)'s Updated Exhibit List Pursuant to 37 CFR 42.63(e), dated Mar. 19, 2019, 8 pgs.
CBM2019-00005 U.S. Pat. No. 8,699,779 Peter Alexander, Ph.D., Oral and Videotaped Deposition, dated Jan. 23, 2019, 27 pgs.
CBM2019-00005 U.S. Pat. No. 8,699,779, Decision Denying Institution of Covered Business Method Patent Review 37 C.F.R. § 42.208, dated Jun. 3, 2019, 27 pgs.
CBM2019-00005 U.S. Pat. No. 8,699,779, Katie Knight Videotape Deposition Transcript, dated Feb. 8, 2019, 27 pgs.
CBM2019-00005 U.S. Pat. No. 8,699,779, Patent Owner's Sur-Reply Brief to Petitioner's Reply Brief to Patent Owner Preliminary Response Pursuant to Authorization Provided in Paper No. 15, dated May 1, 2019, 7 pgs.
CBM2019-00005 U.S. Pat. No. 8,699,779, United Services Automobile Association (USAA)'s Patent Owner Preliminary Response, dated Mar. 4, 2019, 103 pgs.
CBM2019-00027 U.S. Pat. No. 9,224,136 Declaration of Peter Alexander, Ph.D., dated Mar. 28, 2019, 147 pgs.
CBM2019-00027 U.S. Pat. No. 9,224,136 Notice of Filing Date Accorded to Petition and Time for Filing Patent Owner Preliminary Response, dated Apr. 8, 2019, 3 pgs.
CBM2019-00027 U.S. Pat. No. 9,224,136 Petition For Covered Business Method Review of Claims 1-3, 5-9, 11-16 and 18 of U.S. Pat. No. 9,224,136, dated Mar. 28, 2019, 93 pgs.
CBM2019-00028 U.S. Pat. No. 10,013,681, Declaration of Peter Alexander, Ph.D., dated Mar. 28, 2019, 94 pgs.
CBM2019-00028 U.S. Pat. No. 10,013,681, Notice of Filing Date Accorded To Petition and Time For Filing Patent owner Preliminary Response for U.S. Pat. No. 10,013,681, dated Apr. 8, 2019, 3 pgs.
CBM2019-00028 U.S. Pat. No. 10,013,681, Petition For Covered Business Method Review of Claims 1-30 of U.S. Pat. No. 10,013,681, dated Mar. 28, 2019, 99 pgs.
CBM2019-00028 U.S. Pat. No. 10,013,681, Petitioner's Updated Exhibit List (as of Apr. 1, 2019) for U.S. Pat. No. 10,013,681, dated Apr. 1, 2019, 5 pgs.
CBM2019-00028 U.S. Pat. No. 10,013,681, Plaintiff United Services Automobile Association (USAA) Preliminary Claim Constructions And Extrinsic Evidence, dated Mar. 15, 2019, 74 pgs.
CBM2019-00028, “64 Million Smart Phones Shipped Worldwide in 2006” Canalys Newsroom, 2006, 3 pgs.
CBM2019-00028, “Smarter Than Your Average Phone”, Copyright 2006 by Factiva, 4 pgs.
CBM2019-00028, 00000 C720w User Manual for Windows Mobile Smart Phone, Copyright 2006, 352 pgs.
CBM2019-00028, 17-Inch MacBook Pro User's Guide, Copyright 2006 by Apple Computer, Inc., 144 pgs.
CBM2019-00028, Burney, Brett “MacBook Pro with Intel processor is fast, innovative”, Copyright 2006 by Plain Dealer Publishing Co., 2 pgs.
CBM2019-00028, Declaration of Matthew A. Calman in Support of Patent Owner Preliminary Response, dated Jul. 8, 2019, 28 pgs.
CBM2019-00028, Jewell, Mark “Cell Phone Shipments Reach Record 208M” Copyright 2005 by Associated Press, 1 pg.
CBM2019-00028, Malykhina, Elena “Get Smart”, Copyright 2006 by ProQuest Information and Learning Company, 6 pgs.
CBM2019-00028, Nokia 9500 Communicator User Guide, Copyright 2006 by Nokia Corporation, 112 pgs.
CBM2019-00028, Palm Treo 700W Smartphone manual, Copyright 2005 by Palm, Inc., 96 pgs.
CBM2019-00028, Robinson, Daniel, “Client Week—Handsets advance at 3GSM”, Copyright 2004 by VNU Business Publications Ltd., 2 pgs.
CBM2019-00028, Wong, May “HP unveils new mobile computers”, Copyright 2006 by The Buffalo News, 2 pgs.
CBM2019-00029 U.S. Pat. No. 10,013,605, Declaration of Peter Alexander, Ph.D., dated Mar. 28, 2019, 76 pgs.
CBM2019-00029 U.S. Pat. No. 10,013,605, Petition For Covered Business Method Review of Claims 1-3, 5-14, 16-29 of U.S. Pat. No. 10,013,605, dated Mar. 28, 2019, 88 pgs.
CBM2019-00029, “Dynamism.com: Take tomorrow's tech home today with Dynamism.com: Latest gadgets merge next generation technology with high style design”, Copyright 2006 Normans Media Limited, 2 pgs.
CBM2019-00029, Declaration of Matthew A. Calman in Support of Patent Owner Preliminary Response, dated Jul. 17, 2019, 29 pgs.
CBM2019-00029, Defendant's Claim Construction Brief, filed in Civil Action No. 2:18-CV-366, dated May 31, 2019, 28 pgs.
CBM2019-00029, HP User Guide, Additional Product Information, Copyright 2006 by Hewlett-Packard Development Company, L.P., 204 pgs.
CBM2019-00029, Palenchar, Joseph, “PDA Phone Adds Wi-Fi VoIP, Turn-By-Turn GPS Navigation”, Copyright 2006 by Reed Business Information, 2 pgs.
CBM2019-00029, Pocket PC User Manual, Version 1, dated May 2006 by Microsoft, 225 pgs.
CBM2019-00029, United Services Automobile Association (USAA)'s Patent Owner Preliminary Response, dated Jul. 17, 2019, 76 pgs.
CBR online, “Diebold launches ATM depository technology”, Oct. 4, 2007, 5 pgs.
Century Remote Deposit High-Speed Scanner User's Manual Release 2006, (Century Manual), Century Bank, 2006, (32 pgs).
Certificate of Accuracy related to Article entitled, “Deposit checks by mobile” on webpage: https://www.elmundo.es/navegante/2005/07/21/empresas/1121957427.html signed by Christian Paul Scrogum (translator) on Sep. 9, 2021.
Check Clearing for the 21st Century Act Foundation for Check 21 Compliance Training, Federal Financial Institutions Examination Council, (Oct. 16, 2004), available on the Internet at: https://web.archive.org/web/20041016100648/https://www.ffiec.gov/exam/check21/check21foundationdoc.htm, (excerpts attached as Exhibit 20 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 11 pgs.
Chen, Brian et al., iPhone 3GS Trounces Predecessors, Rivals in Web Browser Speed Test, Wired, Jun. 24, 2009, located on the Internet at: www.wired.com/2009.3gs-speed/, 10 pgs.
Cheq Information Technology White Paper, “Teller Scanner Performance and Scanner Design: Camera Position Relative to the Feeder”, dated 2005, 7 pgs.
Chiang, Chuck, The Bulletin, “Remote banking offered”, http://bendbulletin.com/apps/pbcs.dll/article?AID=/20060201/BIZ0102/602010327&templ . . . , May 23, 2008 (2 pgs).
Claim Construction Memorandum Opinion and Order, filed in Civil Action No. 2:18-CV-245, dated Jun. 13, 2019, 48 pgs.
Claim Construction Memorandum Opinion and Order, United Services Automobile Association v. Wells Fargo Bank, N.A., Civil Action No. 2:18-CV-366, dated Jul. 29, 2019, 36 pgs.
CNN.com/technology, “Scan, deposit checks from home”, www.cnn.com/2008/TECH/biztech/02/07/check.scanning.ap/index.html, Feb. 7, 2008 (3 pgs).
Constanzo, Chris, “Remote Check Deposit: Wells Captures A New Checking Twist”, Bank Technology News Article—May 2005, www.americanbanker.com/btn_article.html?id=20050502YQ50FSYG (2 pgs).
Consumer Assistance & Information—Check 21 https://www.fdic.gov/consumers/assistance/protection/check21.html (FDIC).
Cormac Herley, “Recursive Method to Extract Rectangular Objects From Scans”, 4 pages, Oct. 2003.
Craig Vaream, “Image Deposit Solutions” Emerging Solutions for More Efficient Check Processing, Nov. 2005, 16 pages.
Craig, Ben, “Resisting Electronic Payment Systems: Burning Down the House?”, Federal Reserve Bank of Cleveland, Jul. 1999 (4 pgs).
Creativepaymentsolutions.com, “Creative Payment Solutions—Websolution,” www.creativepaymentsolution.com/cps/financialservices/websolution/default.html, Copyright 2008, Creative Payment Solutions, Inc. (1 pg).
Credit Union Journal, “The Ramifications of Remote Deposit Capture Success”, www.cujournal.com/printthis.html?id=20080411EODZT57G, Apr. 14, 2008 (1 pg).
Credit Union Journal, “AFCU Averaging 80 DepositHome Transactions Per Day”, Credit Union Journal, Aug. 15, 2005 (1 pg).
Credit Union Management, “When You wish Upon an Imaging System . . . the Right Selection Process can be the Shining Star,” Credit Union Management, Aug. 1993, printed from the Internet at <http://search.proquest.com/docview/227756409/14138420743684F7722/15?accountid=14 . . . >, on Oct. 19, 2013 (11 pgs).
DCU Member's Monthly—Jan. 2008, “PC Deposit—Deposit Checks from Home!”, http://www.mycreditunionnewsletter.com/dcu/0108/page1.html, Copyright 2008 Digital Federal Credit Union (2 pgs).
De Jesus, A. et al., “Distributed Check Processing in a Check 21 Environment: An educational overview of the opportunities and challenges associated with implementing distributed check imaging and processing solutions,” Panini, 2004, pp. 1-22.
De Queiroz, Ricardo et al., “Mixed Raster Content (MRC) Model for Compound Image Compression”, 1998 (14 pgs).
Debello, James et al., “RDM and Mitek Systems to Provide Mobile Check Deposit,” Mitek Systems, Inc., San Diego, California and Waterloo, Ontario, (Feb. 24, 2009), 2 pgs.
Declaration of Peter Alexander, Ph.D., CBM2019-00004, Nov. 8, 2018, 180 pgs.
Defendant Wells Fargo Bank, N.A.'s Amended Answer, Affirmative Defenses, and Counterclaims to Plaintiff's Amended Complaint, United Services Automobile Association v. Wells Fargo Bank, N.A., Civil Action No. 2:18-cv-245, dated Mar. 7, 2019, 75 pgs.
Defendant Wells Fargo Bank, N.A.'s Amended Answer, Affirmative Defenses, and Counterclaims to Plaintiff's Complaint, United Services Automobile Association v. Wells Fargo Bank, N.A., Civil Action No. 2:18-cv-366, dated Apr. 12, 2019, 32 pgs.
Defendant Wells Fargo Bank, N.A.'s Second Amended Answer, Affirmative Defenses, and Counterclaims to Plaintiff's Amended Complaint, United Services Automobile Association v. Wells Fargo Bank, N.A., Civil Action No. 2:18-cv-245, dated Aug. 1, 2019, 72 pgs.
Defendant's Claim Construction Brief, filed in Civil Action No. 2:18-CV-366, dated May 31, 2019, 111 pgs.
Defendant's Claim Construction Brief, filed in Civil Action No. 2:18-CV-366, dated May 31, 2019, 28 pgs.
DeYoung, Robert; “The Financial Performance of Pure Play Internet Banks”; Federal Reserve Bank of Chicago Economic Perspectives; 2001; pp. 60-75; vol. 25, No. 1 (16 pgs).
Dhandra, B.V. et al., “Skew Detection in Binary Image Documents Based on Image Dilation and Region labeling Approach”, IEEE, The 18th International Conference on Pattern Recognition (ICPR'06), 2006, 4 pgs.
Dias, Danilo et al., “A Model for the Electronic Representation of Bank Check”, Brasilia Univ. Oct. 2006 (5 pgs).
Digital Transactions News, “An ACH-Image Proposal For Check Roils Banks and Networks” May 26, 2006 (3 pgs).
Dinan, R.F. et al., “Image Plus High Performance Transaction System”, IBM Systems Journal, 1990 vol. 29, No. 3 (14 pgs).
Doermann, D. et al., “The function of documents”, Image and Vision Computing, vol. 16, 1998, pp. 799-814.
Doermann, David et al., “Progress in Camera-Based Document Image Analysis,” Proceedings of the Seventh International Conference on Document Analysis and Recognition (ICDAR 2003) 0-7695-1960-1/03, 2003, IEEE Computer Society, 11 pages.
Duvall, Mel, “Remote Deposit Capture,” Baseline, vol. 1, Issue 70, Mar. 2007, 2 pgs.
eCU Technologies, “Upost Remote Deposit Solution,” Retrieved from the Internet https://www.eutechnologies.com/products/upost.html, downloaded 2009 (1 pg).
EFT Network Unveils FAXTellerPlus, EFT Network, Inc., www.eftnetwork.com, Jan. 13, 2009 (2 pgs).
ElectronicPaymentProviders, Inc., “FAQs: ACH/ARC, Check Verification/Conversion/Guarantee, RCK Check Re-Presentment,” http://www.useapp.com/faq.htm, downloaded Oct. 18, 2006 (3 pgs).
Excerpts from American National Standard for Financial Services, ANS, X9.100-140-2004-Specifications for an Image Replacement Document—IRD, Oct. 1, 2004, 16 pgs.
FDIC—Remote Capture: A Primer, 2009, 3 pgs.
FDIC: Check Clearing for the 21st Century Act (Check 21), Fed. Deposit Ins. Corp., Apr. 25, 2016, retrieved from https://web.archive.org/web/20161005124304/https://www.fdic.gov/consumers/assistance/protection/check21.html (“FDIC”).
Federal Check 21 Act, “New Check 21 Act effective Oct. 28, 2004: Banks No Longer Will Return Original Cancelled Checks,” Consumer Union's FAQ's and Congressional Testimony on Check 21, www.consumerlaw.org/initiatives/content/check21_content.html, cited in U.S. Pat. No. 7,873,200, as dated Dec. 2005 (20 pgs).
Federal Reserve Adoption of DSTU X9.37-2003, Image Cash Letter Customer Documentation Version 1.8, dated Oct. 1, 2008, 48 pgs.
Federal Reserve Bank, “Reserve Banks to Adopt DSTU X9.37-2003 Format for Check 21 Image Services”, 2004, 2 pgs.
Federal Reserve Banks Plan Black-and-White Image Standard and Quality Check, May 2004, 2 pgs.
Federal Reserve Board, “Check Clearing for the 21st Century Act”, FRB, http://www.federalreserve.gov/paymentsystems/truncation/, Mar. 1, 2006 (1 pg).
Federal Reserve System, “12 CFR, Part 229 [Regulation CC; Docket No. R-0926]: Availability of Funds and Collection of Checks,” Federal Register, Apr. 28, 1997, pp. 1-50.
Federal Reserve System, “Part IV, 12 CFR Part 229 [Regulation CC; Docket No. R-1176]: Availability of Funds and Collection of Checks; Final Rule,” Federal Register, vol. 69, No. 149, Aug. 4, 2004, pp. 47290-47328.
Fest, Glen, “Patently Unaware”, Bank Technology News, Apr. 2006, Retrieved from the Internet at URL: http://banktechnews.com/article.html?id=2006403T7612618 (6 pgs).
Fidelity Information Services, “Strategic Vision Embraces Major Changes in Financial Services Solutions: Fidelity's long-term product strategy ushers in new era of application design and processing,” Insight, 2004, pp. 1-14.
Fielding, R. et al., “RFC-2616—Hypertext Transfer Protocol”, Network Working Group, The Internet Society copyright 1999, 177 pgs.
Fisher, Dan M., “Home Banking in the 21st Century: Remote Capture Has Gone Retail”, May 2008 (4 pgs).
Fletcher, Lloyd A., and Rangachar Kasturi, “A robust algorithm for text string separation from mixed text/graphics images”, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 10, no. 6, 1988, pp. 910-918.
Fujisawa, H. et al., “Information Capturing Camera and Developmental Issues”, IEEE Xplore, downloaded on Aug. 18, 2020, 4 pgs.
Furst, Karen et al., “Internet Banking: Developments and Prospects”, Economic and Policy Analysis Working Paper 2000-9, Sep. 2000 (60 pgs).
Garry, M., “Checking Options: Retailers face an evolving landscape for electronic check processing that will require them to choose among several scenarios,” Supermarket News, vol. 53, No. 49, 2005 (3 pgs).
Gates, A History of Wireless Standards, Wi-Fi Back to Basics, Aerohive Blog, Jul. 2015, located on the Internet at: http://blog.aerohive.com/a-history-of-wireless-standards, 5 pgs.
German Shegalov, Diplom-Informatiker, “Integrated Data, Message, and Process Recovery for Failure Masking in Web Services”, Dissertation Jul. 2005 (146 pgs).
Guidelines for Evaluation of Radio Transmission Technologies for IMT-2000, dated 1997, ITU-R M.1225, located on the Internet at: https://www.itu.int/dms_pubrec/itu-r/rec/m/R-REC-M.1225-0-199702-I!!PDF-E.pdf, 60 pgs.
Gupta, Amar et al., “An Integrated Architecture for Recognition of Totally Unconstrained Handwritten Numerals”, WP#3765, Jan. 1993, Productivity from Information Technology “Profit” Research Initiative Sloan School of Management (20 pgs).
Gupta, Maya R. et al., “OCR binarization and image pre-processing for searching historical documents,” Pattern Recognition, vol. 40, No. 2, Feb. 2007, pp. 389-397.
Gutierrez, L., “Innovation: From Campus to Startup”, Business Watch, 2008, 2 pgs.
Hale, J., “Picture this: Check 21 uses digital technology to speed check processing and shorten lag time,” Columbus Business First, http://columbus.bizjournals.com/columbus/stories/2005/03/14focus1.html, downloaded 2007 (3 pgs).
Halonen et al., GSM, GPRS, and EDGE Performance: Evolution Towards 3G/UMTS, Second Edition (2003).
Hartly, Thomas, “Banks Check Out New Image”, Business First, Buffalo: Jul. 19, 2004, vol. 20, Issue 43, (3 pgs).
Heckenberg, D. “Using Mac OS X for Real-Time Image Processing” Oct. 8, 2003 (15 pgs).
Hello Ocean User Manual, located on the Internet at: https://standupwireless.com/wp-content/uploads/2017/04/Manual_PAN-TECH_OCEAN.pdf (excerpts attached as Exhibit 10 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018) 76 pgs.
Herley, Cormac, “Efficient Inscribing of Noisy Rectangular Objects in Scanned Images,” 2004 International Conference on Image Processing, 4 pages.
Heron, Advanced Encryption Standard (AES), 12 Network Security 8 (2009).
Higgins, Ray et al., “Working With Image Cash Letters (ICLs) X9.37, 180 or 187 files”, All My Papers, 2009, 36 pgs.
Hildebrand, C. et al., “Electronic Money,” Oracle, http://www.oracle.com/oramag/profit/05-feb/p15financial.html, 2005, downloaded Oct. 18, 2005 (5 pgs).
Hill, “From J-Phone to Lumia 1020: A complete history of the camera phone”, Digital Trends, 2020, 9 pgs.
Hill, Simon, “From J-Phone to Lumia 1020: A Complete History of the Camera Phone”, dated Aug. 11, 2013, 19 pgs.
Hillebrand, G., “Questions and Answers About the Check Clearing for the 21st Century Act, ‘Check 21’,” ConsumersUnion.org, http://www.consumersunion.org/finance/ckclear1002.htm, Jul. 27, 2004, downloaded Oct. 18, 2006 (6 pgs).
Hoffman, J., “Before there Were Smartphones, There was I-Mode”, 1999, 5 pgs.
HTC Touch Diamond Manual, copyright 2008, (attached as Exhibit 11 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 257 pgs.
Humphrey, David B. and Hunt, Robert, “Getting Rid of Paper: Savings From Check 21”, Working Paper No. 12-12, Research Department, Federal Reserve Bank of Philadelphia, (May 2012), available on the Internet at: https://philadelphiafed.org/-/media/research-and-data/publications/working-papers/2012/wp12-12.pdf, (attached as Exhibit 14 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 29 pgs.
Iida, Jeanne, “The Back Office: Systems—Image Processing Rolls on as Banks Reap Benefits,” American Banker, Jul. 19, 1993, printed from the Internet at <http://search.proquest.com/docview/292903245/14138420743684F7722/14?accountid=14 . . . >, on Oct. 19, 2013 (3 pgs).
Image Master, “Photo Restoration: We specialize in digital photo restoration and photograph repair of family pictures”, http://www.imphotorepair.com, cited in U.S. Pat. No. 7,900,822, as downloaded Apr. 2007 (1 pg).
Immich et al., Performance Analysis of Five Interprocess Communication Mechanisms Across UNIX Operating Systems, 68 J. Sys. & Software 27 (2003).
Instrument—Definition from the Merriam-Webster Online Dictionary, dated Mar. 2, 2019, 1 pg.
Instrument—Definition of Instrument from the Oxford Dictionaries (British & World English), dated Jul. 2, 2017, 44 pgs.
Investment Systems Company, “Portfolio Accounting System,” 2000, 34 pgs.
iPhone App Store Downloads Top 10 Million in First Weekend, Jul. 14, 2008, located on the Internet at: https://www.apple.com/newsroom/2008/07/14iPhone-App-Store-Downloads-Top-10-Million-in-First-Weekend, 4 pgs.
iPhone Application Programming Guide Device Support, dated Apr. 26, 2009, 7 pgs.
IPR2022-00049 filed Oct. 22, 2021 on behalf of PNC Bank N.A., 70 pages.
IPR2022-00050 filed Oct. 22, 2021 on behalf of PNC Bank N.A., 126 pages.
IPR2022-00075 filed Nov. 5, 2021 on behalf of PNC Bank N.A., 90 pages.
IPR2022-00076 filed Nov. 17, 2021 on behalf of PNC Bank N.A., 98 pages.
IPR2019-00815 U.S. Pat. No. 9,818,090, Decision Denying Institution of Inter Partes Review, dated Aug. 26, 2019, 28 pgs.
IPR2019-00815 U.S. Pat. No. 9,818,090, Declaration of Peter Alexander, PhD, as filed in the IPR on Mar. 20, 2019, 99 pgs.
IPR2019-00815 U.S. Pat. No. 9,818,090, Exhibit B Proposed Claim Constructions for the '571, '090, '779 and '517 Patents, filed on Feb. 28, 2019, 10 pgs.
IPR2019-00815 U.S. Pat. No. 9,818,090, Notice of Filing Date Accorded to Petition and Time for Filing Patent Owner Preliminary Response, dated Mar. 27, 2019, 5 pgs.
IPR2019-00815 U.S. Pat. No. 9,818,090, Petition For Inter Partes Review of Claims 1-19 of U.S. Pat. No. 9,818,090, dated Mar. 20, 2019, 56 pgs.
IPR2019-00815 U.S. Pat. No. 9,818,090, Petitioner's Reply Brief to Patent Owner Preliminary Response Pursuant To Authorization Provided In Paper No. 13, dated Aug. 1, 2019, 9 pgs.
IPR2019-00815 U.S. Pat. No. 9,818,090, Petitioner's Supplemental Exhibit List, dated Aug. 1, 2019, 5 pgs.
IPR2019-00815 U.S. Pat. No. 9,818,090, United Services Automobile Association (“USAA”)'s Sur-Reply In Support of Patent Owner Preliminary Response, dated Aug. 8, 2019, 8 pgs.
IPR2019-00815, Declaration of Matthew A. Calman In Support of Patent Owner Preliminary Response, dated Jun. 27, 2019, 25 pgs.
IPR2019-00815, Federal Reserve Financial Services Retired: DSTU X9.37-2003, Specifications for Electronic Exchange of Check and Image Data, Copyright 2006 by Accredited Standards Committee X9, Inc., dated Mar. 31, 2003, 157 pgs.
IPR2019-00815, Invalidity Chart, uploaded on Jun. 27, 2019, 94 pgs.
IPR2019-00815, Supplementary Invalidity Chart, dated Jun. 27, 2019, 16 pgs.
IPR2019-00815, United Services Automobile Association (“USAA”)'s Patent Owner Preliminary Response, dated Jun. 27, 2019, 66 pgs.
IPR2019-01081—Exhibit USAA Ex. 2044, p. 1, Wells Fargo v. USAA, titled “Mitek Introduces ImageNet Mobile Deposit”, copyright 2007 by Mitek Systems, 4 pgs.
IPR2019-01081—U.S. Pat. No. 9,336,517 B1, Decision Granting Institution of Inter Partes Review 35 U.S.C. § 314; 37 C.F.R. § 42.4, dated Jan. 13, 2020, 60 pgs.
IPR2019-01081 U.S. Pat. No. 9,336,517, Petition For Inter Partes Review of Claims 1, 5-10, 12-14, 17-20 of U.S. Pat. No. 9,336,517, dated Jun. 5, 2019, 78 pgs.
IPR2019-01081, Declaration of Peter Alexander, Ph.D, dated Jun. 5, 2019, 135 pgs.
IPR2019-01082—U.S. Pat. No. 8,977,571 B1, Decision Granting Institution of Inter Partes Review 35 U.S.C. § 314; 37 C.F.R. § 42.4, dated Dec. 13, 2019, 56 pgs.
IPR2019-01082 U.S. Pat. No. 8,977,571, Petition for Inter Partes Review of Claims 1-13 of U.S. Pat. No. 8,977,571, dated Jun. 5, 2019, 75 pgs.
IPR2019-01082—U.S. Pat. No. 8,977,571 B1, Decision Granting Institution of Inter Partes Review 35 U.S.C. § 314, dated Jan. 9, 2020, 58 pgs.
IPR2019-01083 U.S. Pat. No. 8,699,779, Petition for Inter Partes Review of Claims 1-18 of U.S. Pat. No. 8,699,779, dated Jun. 5, 2019, 74 pgs.
IPR2020-00091, U.S. Pat. No. 9,177,198, Petition for Inter Partes Review of Claims 1-3 and 5-20 of U.S. Pat. No. 9,177,198, dated Nov. 7, 2019, 72 pgs.
IPR2020-00882—Mitek Systems, Inc. v. United Services Automobile Association, Petition for Inter Partes Review of U.S. Pat. No. 9,818,090, dated Apr. 30, 2020, 102 pgs.
IPR2020-00976—Mitek Systems, Inc. v. United Services Automobile Association, Petition for Inter Partes Review of U.S. Pat. No. 8,699,779, dated May 22, 2020, 87 pgs.
IPR2020-01101—Mitek Systems, Inc. v. United Services Automobile Association, Petition for Inter Partes Review of U.S. Pat. No. 9,336,517, dated Jun. 12, 2020, 91 pgs.
JBC, “What is a MICR Line?,” eHow.com, retrieved from http://www.ehow.com/about_4684793_what-micr-line.html on May 4, 2009 (2 pgs).
Johnson, Jennifer J., Secretary of the Board; Federal Reserve System, 12 CFR Part 229, Regulation CC; Docket No. R-1176, “Availability of Funds and Collection of Checks”, cited in U.S. Pat. No. 7,900,822, as dated 2009 (89 pgs).
Joinson et al., Olympus E-30 Review (pts. 1,4,8), Digital Photography Review, Mar. 24, 2009, located on the internet at: www.dpreview.com/reviews/olympus30, 26 pgs.
Jung et al., “Rectangle Detection based on a Windowed Hough Transform”, IEEE Xplore, 2004, 8 pgs.
Kendrick, Kevin B., “Check Kiting, Float for Purposes of Profit,” Bank Security & Fraud Prevention, vol. 1, No. 2, 1994 (3 pgs).
Kiser, Elizabeth K.; “Modeling the Whole Firm: The Effect of Multiple Inputs and Financial Intermediation on Bank Deposit Rates;” FEDS Working Paper No. 2004-07; Jun. 3, 2003; pp. 1-46 (46 pgs).
Klein, Robert, Financial Services Technology, “Image Quality and Usability Assurance: Phase 1 Project”, dated Jul. 23, 2004, 67 pgs.
Knerr et al., The A2iA Intercheque System: Courtesy Amount and Legal Amount Recognition for French Checks, in Automatic Bankcheck Processing 43-86, Impedovo et al. eds., 1997, 50 pgs.
Knestout, Brian P. et al., “Banking Made Easy”, Kiplinger's Personal Finance, Washington, Jul. 2003, vol. 57, Iss. 7 (5 pgs).
Koga, M. et al., “Camera-based Kanji OCR for Mobile-phones: Practical Issues”, IEEE, 2005, 5 pgs.
Kornai, Andras et al., “Recognition of Cursive Writing on Personal Checks”, Proceedings of International Workshop on the Frontiers in Handwriting Recognition, cited in U.S. Pat. No. 7,900,822, as dated Sep. 1996 (6 pgs).
Lacker, Jeffrey M., “Payment System Disruptions and the Federal Reserve Following Sep. 11, 2001”, The Federal Reserve Bank of Richmond, (Dec. 23, 2003) (attached as Exhibit 19 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 55 pgs.
Laine, M. et al., “A Standalone OCR System For Mobile Cameraphones”, IEEE, 2006, 5 pgs.
Lampert, Christoph et al., “Oblivious Document Capture and Real-Time Retrieval,” International Workshop on Camera Based Document Analysis and Recognition (CBDAR), 2005 (8 pgs). (Retrieved from: http://www-cs.ccny.cuny.edu/~wolberg/capstone/bookwarp/LampertCBDAR05.pdf).
Lange, Bill, “Combining Remote Capture and IRD Printing, A Check 21 Strategy For Community and Regional Banks”, dated 2005, 25 pgs.
Leach, et al., A Universally Unique Identifier (UUID) URN Namespace, (Jul. 2005) retrieved from https://www.ietf.org/rfc/rfc4122.txt.
Lee, Jeanne, “Mobile Check Deposits: Pro Tips to Ensure They Go Smoothly”, dated Feb. 19, 2016, 6 pgs.
Leica Digilux 2 Instructions located on the Internet: http://www.overgaard.dk/pdf/d2_manual.pdf (attached as Exhibit 2 from the Defendant Wells Fargo Bank N.A.'s Answer dated Aug. 14, 2018), 95 pgs.
Levitin, Adam J., Remote Deposit Capture: A Legal and Transactional Overview, Banking Law Journal, p. 115, 2009 (RDC), 8 pgs.
Liang, J et al., “Camera-based analysis of text and documents: a survey”, IJDAR, vol. 7, 2005, pp. 84-104, 21, pgs.
Liang, Jian et al., Camera-Based Analysis of Text and Documents: A Survey, International Journal on Document Analysis and Recognition, Jun. 21, 2005, 21 pages.
Luo, Xi-Peng et al., “Design and Implementation of a Card Reader Based on Build-In Camera,” Proceedings of the 17th International Conference on Pattern Recognition, 2004, 4 pages.
MacKenzie, E., Photography Made Easy, copyright 1845, 80 pgs.
Magid, L., “A baby girl and the camera phone were born 20 years ago”, Mercury News, 2017, 3 pgs.
Masonson, L., “Check Truncation and ACH Trends—Automated Clearing Houses”, Healthcare Financial Management Association, http://www.findarticles.com/p/articles/mi_m3276/is_n7_v47/ai_14466034/print, 1993 (2 pgs).
Matthews, Deborah, “Advanced Technology Makes Remote Deposit Capture Less Risky,” Indiana Bankers Association, Apr. 2008 (2 pgs).
Meara, Bob, “State of Remote Deposit Capture 2015 Mobile Is the New Scanner”, dated May 2015, 56 pgs.
Meara, Bob, “USAA's Mobile Remote Deposit Capture”, Dated Jun. 26, 2009, 2 pgs.
Metro 1 Credit Union, “Remote Banking Services,” http://www.metro1cu.org/metro1cu/remote.html, downloaded Apr. 17, 2007 (4 pgs).
Microsoft Mobile Devices Buyer's Guide, 2012, 4 pgs.
Microsoft Mobile Devices Smartphone, 2003, 2 pgs.
Mirmehdi, M. et al., “Extracting Low Resolution Text with an Active Camera for OCR”, in Proceedings of the IX Spanish Symposium on Pattern Recognition and Image Processing (pp. 43-48), 2001, 6 pgs.
Mirmehdi, M. et al., “Towards Optimal Zoom for Automatic Target Recognition”, in Proceedings of the Scandinavian Conference on Image Analysis, 1:447-454, 1997, 7 pgs.
Mitek Systems, “Imagenet Mobile Deposit”, San Diego, CA, downloaded 2009 (2 pgs).
Mitek Systems: Mitek Systems Launches First Mobile Check Deposit and Bill Pay Application, San Diego, CA, Jan. 22, 2008 (3 pgs).
Mitek Video titled “Mobile Deposit Tour”, Published on Jul. 2, 2009 by Mitek Systems, duration 2 minutes and 13 seconds, located on the Internet at: https://www.youtube.com/watch?v=sGD49ybxS2Q, 25 pgs.
Mitek, “Video Release—Mitek MiSnap ™ Mobile Auto Capture Improves Mobile Deposit® User Experience at Ten Leading Financial Institutions”, dated Jul. 15, 2014, 2 pgs.
Mitek's Mobile Deposit Processes More Than Two Billion Checks, $1.5 Trillion in Cumulative Check Value, dated Mar. 8, 2018, 2 pgs.
Mohl, Bruce, “Banks Reimbursing ATM Fee to Compete With Larger Rivals”, Boston Globe, Boston, MA, Sep. 19, 2004 (3 pgs).
Moreau, T., “Payment by Authenticated Facsimile Transmission: a Check Replacement Technology for Small and Medium Enterprises,” Connotech Experts-conseils, Inc., Apr. 1995 (31 pgs).
Motomanual for Motorazr, located on the Internet at: https://www.cellphones.ca/downloads/phones/manuals/motorola-razr-v3xx-manual.pdf (excerpts attached as Exhibit 8 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 34 pgs.
Motomanual, MOTOROKR-E6-GSM-English for wireless phone, copyright 2006, 144 pgs.
Motorola RAZR MAXX V6 User Manual, located on the Internet at: https://www.phonearena.com/phones/Motorola-RAZR-MAXX-V6_id1680, (attached as Exhibit 7 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 36 pgs.
N. Ritter & M. Ruth, The GeoTIFF Data Interchange Standard for Raster Geographic Images, 18 Int. J. Remote Sensing 1637 (1997).
NCR, Mobile Remote Deposit Capture (RDC), copyright 2011, 8 pgs.
Nelson, B. et al., “Remote deposit capture changes the retail landscape,” Northwestern Financial Review, http://findarticles.com/p/articles/mi_qa3799/is_200607/ai_n16537250, 2006 (3 pgs).
NetBank, Inc., “Branch Out: Annual Report 2004,” 2004 (150 pgs).
NetBank, Inc., “Quick Post: Deposit and Payment Forwarding Service,” 2005 (1 pg).
NetDeposit Awarded Two Patents for Electronic Check Process, NetDeposit, Jun. 18, 2007, (1 pg).
Nikon Digital Camera D300 User's Manual, located on the Internet at: http://download.nikonimglib.com/archive2/iBuJv00Aj97i01y8BrK49XX0Ts69/D300,EU(En)04.pdf (attached as Exhibit 5 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 195 pgs.
Nixon, Julie et al., “Fiserv Research Finds Banks are Interested in Offering Mobile Deposit Capture as an,” Fiserv, Inc. Brookfield, Wis., (Business Wire), (Feb. 20, 2009), 2 pgs.
Nokia N90 Phone Features, 2005, 4 pgs.
Nokia N90 Review Digital Trends, dated Feb. 11, 2019, obtained from the Internet at: https://www.digitaltrends.com/cell-phone-reviews/nokia-n90-review/, 11 pgs.
Nokia N90 User Guide, 2005, 132 pgs.
Nokia N95 8GB User Guide, copyright 2009, (from the Wells Fargo Bank, N.A. IPR2019-00815, filed on Mar. 20, 2019), Part 1 of 3, 67 pgs.
Nokia N95 8GB User Guide, copyright 2009, (from the Wells Fargo Bank, N.A. IPR2019-00815, filed on Mar. 20, 2019), Part 2 of 3, 60 pgs.
Nokia N95 8GB User Guide, copyright 2009, (from the Wells Fargo Bank, N.A. IPR2019-00815, filed on Mar. 20, 2019), Part 3 of 3, 53 pgs.
Nokia N95 8GB User Guide, copyright 2009, located on the Internet at: https://www.nokia.com/en_int/phones/sites/default/files/user-guides/Nokia_N95_8GB_Extended_UG_en.pdf (excerpts attached as Exhibit 9 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 77 pgs.
Online Deposit: Frequently Asked Questions, http://www.depositnow.com/faq.html, Copyright 2008 (1 pg).
Onlinecheck.com/Merchant Advisors, “Real-Time Check Debit”, Merchant Advisors: Retail Check Processing Check Conversion, http://www.onlinecheck/wach/rcareal.htm, cited in U.S. Pat. No. 7,900,822, as dated 2006 (3 pgs).
Oxley, Michael G., from committee on Financial Services: “Check Clearing For the 21st Century Act, 108th Congress, 1st Session House of Representatives report 108-132”, Jun. 2003 (20 pgs).
Oxley, Michael G., from committee of conference: “Check Clearing For the 21st Century Act” 108th Congress, 1st Session Senate report 108-291, Oct. 1, 2003 (20 pgs).
P.R. 4-3 Joint Claim Construction and Pre-Hearing Statement, United Services Automobile Association v. Wells Fargo Bank, N.A., Civil Action No. 2:18-cv-366, dated Apr. 5, 2019, 190 pgs.
Palacios, Rafael et al., “Automatic Processing of Brazilian Bank Checks”, Cited in U.S. Pat. No. 7,900,822, as dated 2002 (28 pgs).
Panasonic Operating Instructions for Digital Camera/Lens Kit Model No. DMC-L1K, https://www.panasonic.com/content/dam/Panasonic/support_manual/Digital_Still_Camera/English_01-vqt0-vqt2/vqt0w95_L1_oi.pdf (attached as Exhibit 4 from the Defendant Wells Fargo Bank N.A.'s Answer dated Aug. 14, 2018), 129 pgs.
Panini My Vision X Operator Manual, Panini, 2004, (cited in IPR2020-00093, U.S. Pat. No. 9,892,454), 51 pgs.
Pappas, A., “Taking Sharper Pictures Is Now a Snap as Sprint Launches First 1.3-Megapixal Camera Phone in the United States”, 2004, 2 pgs.
Parikh, T., “Mobile Phones and Paper Documents: Evaluating a New Approach for Capturing Microfinance Data in Rural India”, CHI 2006 Proceedings, 2006, 10 pgs.
Parikh, T., “Using Mobile Phones for Secure, Distributed Document Processing in the Developing World”, IEEE Pervasive Computing, vol. 4, No. 2, 2009, 9 pgs.
Parties' P.R. 4-5(d) Joint Claim Construction Chart, filed in Civil Action No. 2:18-CV-245, dated Jun. 14, 2019, 28 pgs.
Parties' P.R. 4-5(d) Joint Claim Construction Chart, filed in Civil Action No. 2:18-CV-366, dated Jun. 18, 2019, 27 pgs.
Parties' P.R. 4-5(d) Joint Claim Construction Chart, United Services Automobile Association v. Wells Fargo Bank, N.A., Civil Action No. 2:18-cv-245, dated May 9, 2019, 25 pgs.
Patent Disclaimer for U.S. Pat. No. 8,699,779, filed on Mar. 4, 2019, 2 pgs.
Patent Disclaimer for U.S. Pat. No. 8,977,571, filed on Feb. 20, 2019, 2 pgs.
Patent Disclaimer for U.S. Pat. No. 9,336,517, filed on Mar. 4, 2019, 2 pgs.
Patent Disclaimer for U.S. Pat. No. 9,818,090, filed on Feb. 20, 2019, 2 pgs.
Patterson, Scott “USAA Deposit@Home—Another WOW moment for Net Banking”, NextCU.com, Jan. 26, 2007 (5 pgs).
Pbmplus—image file format conversion package, retrieved from https://web.archive.org/web/20040202224728/https:/www.acme.com/software/pbmplus/.
Petition filed by PNC Bank N.A. for Inter Partes Review of Claims 1-18 of U.S. Pat. No. 10,621,559, dated Jul. 21, 2021, IPR2021-01076, 111 pages.
Petition filed by PNC Bank N.A. for Inter Partes Review of Claims 1-18 of U.S. Pat. No. 10,621,559, dated Jul. 21, 2021, IPR2021-01077, 100 pages.
Petition filed by PNC Bank N.A. for Inter Partes Review of Claims 1-23 of U.S. Pat. No. 10,482,432, dated Jul. 14, 2021, IPR2021-01071, 106 pages.
Petition filed by PNC Bank N.A. for Inter Partes Review of Claims 1-30 of U.S. Pat. No. 10,013,681, filed Aug. 27, 2021, IPR2021-01381, 127 pages.
Petition filed by PNC Bank N.A. for Inter Partes Review of Claims 1-7, 10-21 and 23 of U.S. Pat. No. 10,482,432, dated Jul. 14, 2021, IPR2021-01074.
Petition filed by PNC Bank N.A. for Inter Partes Review of U.S. Pat. No. 10,013,605, filed Aug. 27, 2021, IPR2021-01399, 113 pages.
Petition filed by Wells Fargo Bank, N.A. for Covered Business Method Review of Claims 1-20 of U.S. Pat. No. 9,336,517, dated Nov. 8, 2018, 98 pgs.
Petition filed by Wells Fargo Bank, N.A. for Covered Business Method Review of Claims 1-20 of U.S. Pat. No. 9,818,090, dated Nov. 8, 2018, 90 pgs.
Petition filed by Wells Fargo Bank, N.A. for Covered Business Method Review of Claims 1-20 of U.S. Pat. No. 8,977,571, dated Nov. 8, 2018, 95 pgs.
Petition filed by Wells Fargo Bank, N.A. for Covered Business Method Review of Claims 1-23 of U.S. Pat. No. 8,699,779, dated Nov. 8, 2018, 101 pgs.
Plaintiff and Counterclaim Defendant's Answer to Defendant and Counterclaims Plaintiff's Amended Answer, Affirmative Defenses, & Counterclaims, United Services Automobile Association v. Wells Fargo Bank, N.A., Civil Action No. 2:18-cv-245, dated Mar. 21, 2019, 36 pgs.
Plaintiff and Counterclaim Defendant's Answer to Defendant and Counterclaims Plaintiff's Amended Answer, Affirmative Defenses, & Counterclaims, United Services Automobile Association v. Wells Fargo Bank, N.A., Civil Action No. 2:18-cv-366, dated Apr. 26, 2019, 18 pgs.
Plaintiff's Notice of Decision Denying Institution of Covered Business Method Patent Review, filed in Civil Action No. 2:18-CV-245, dated May 15, 2019, 36 pgs.
Plaintiff's Notice of Decisions Denying Institution of Covered Business Method Patent Review, filed in Civil Action No. 2:18-CV-245, dated Jun. 6, 2019, 61 pgs.
Plaintiff's Notice of Filing Claim Construction Presentation, filed in Civil Action No. 2:18-CV-245, dated May 23, 2019, 106 pgs.
PNC Bank to Offer Ease of Online Deposit Service Integrated with QuickBooks to Small Business, RemoteDepositCapture.com, Jul. 24, 2006, 2 pgs.
POP, ARC and BOC—A Comparison, Federal Reserve Banks, at 1 (Jan. 7, 2009), available on the Internet at: https://web.archive.org/web/20090107101808/https://www.frbservices.org/files/eventseducation/pdf/pop_arc_boc_comparison.pdf (attached as Exhibit 13 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 3 pgs.
Provisional patent application filed by Wells Fargo Bank, dated Jan. 29, 2008, 134 pgs.
Public Law 108-100, 108 Congress; “An Act Check Clearing For the 21st Century Act”, Oct. 28, 2003, 117 STAT. 1177 (18 pgs).
Quinn and Roberds, The Evolution of the Check as a Means of Payment: A Historical Survey, Federal Reserve Bank of Atlanta, Economic Review, 2008, 30 pgs.
Rao, Bharat; “The Internet And The Revolution in Distribution: A Cross—Industry Examination”; Technology in Society; 1999; pp. 287-306; vol. 21, No. 3 (20 pgs).
Readdle, Why Scanner Pro is Way Better Than Your Camera? (Jun. 27, 2016) retrieved from https://readdle.com/blog/why-scanner-pro-is-way-better-than-your-camera.
Remote Deposit Capture Basic Requirements, dated Aug. 22, 2009, 1 pg.
Remote Deposit Capture.com Scanner Matrix, dated Oct. 21, 2011, 3 pgs.
Remotedepositcapture, URL:www.remotedepositcapture.com, Cited in U.S. Pat. No. 7,900,822, as dated 2006 (5 pgs).
RemoteDepositCapture.com, Remote Deposit Capture News Articles from Jul. 6, 2006, “BankServ Announces New Remote Deposit Product Integrated with QuickBooks” (3 pgs).
Remotedepositcapture.com, LLC, “Remote Deposit Capture Overview,” RDC Overview, http://remotedepositcapture.com/overview/RDC_overview.htm, Cited in U.S. Pat. No. 7,900,822, as dated Mar. 12, 2007 (4 pgs).
Richey, J. C. et al., “EE 4530 Check Imaging,” Nov. 18, 2008 (10 pgs).
Ritzer, J.R. “Hinky Dinky helped spearhead POS, remote banking movement”, Bank Systems and Equipment, vol. 21, No. 12, Dec. 1984 (1 pg).
Rivlin, Alice M. et al., Chair, Vice Chair—Board of Governors, Committee on the Federal Reserve in the Payments Mechanism—Federal Reserve System, “The Federal Reserve in the Payments Mechanism”, Jan. 1998 (41 pgs).
Rockwell, The Megapixel Myth, KenRockwell.com, 2008, located on the Internet at: http://kenrockwell.com/tech/mpmyth.htm, 6 pgs.
Rohs, M. et al., “A Conceptual Framework for Camera Phone-based Interaction Techniques”, Swiss Federal Institute of Technology, Zurich, Switzerland, 19 pgs.
Rose, Sarah et al., “Best of the Web: The Top 50 Financial Websites”, Money, New York, Dec. 1999, vol. 28, Iss. 12 (8 pgs).
Rowles, Tony, USAA v. Wells Fargo No. 2:16-cv-245-JRGL, e-mail correspondence dated Jan. 24, 2019, 2 pgs.
Santomero, The Evolution of Payments in the U.S.: Paper vs. Electronic (2005) retrieved from https://web.archive.org/web/20051210185509/https://www.philadelphiafed.org/publicaffairs/speeches/2005_santomero9.html.
SCH-i910 Portable Dualmode Smartphone User Guide by Samsung, Copyright 2009 Samsung Electronics Canada, downloadable from www.manualslib.com, 168 pgs.
Schindler, Scanner Pro Review (Dec. 27, 2016) retrieved from https://www.pcmag.com/reviews/scanner-pro.
Sechrest, Stuart et al., “Windows XP Performance”, Microsoft, dated Jun. 1, 2001, 20 pgs.
Shah, Moore's Law, Continuous Everywhere But Differentiable Nowhere, Feb. 12, 2009, located on the Internet at: http://samjshah.com/2009/02/24/morres-law/, 5 pgs.
Shelby, Hon. Richard C. (Committee on Banking, Housing and Urban Affairs); “Check Truncation Act of 2003”, calendar No. 168, 108th Congress, 1st Session Senate report 108-79, Jun. 2003 (27 pgs).
Sing Li & Jonathan Knudsen, Beginning J2ME: From Novice to Professional, Third Edition (2005), ISBN (pbk): 1-59059-479-7, 468 pages.
Sony Digital Camera User's Guide/Troubleshooting Operating Instructions, copyright 2005, located on the Internet at: https://www.sony.co.uk/electronics/support/res/manuals/2654/26544941M.pdf (attached as Exhibit 3 from the Defendant Wells Fargo Bank N.A.'s Answer dated Aug. 14, 2018), 136 pgs.
Sony Ericsson K800i—Product Overview, 2006, 2 pgs.
Sony Ericsson K800i, User Manual, Part 1, 2006, 98 pgs.
SoyBank Anywhere, “Consumer Internet Banking Service Agreement,” Dec. 6, 2004 (6 pgs).
Spencer, Harvey, “White Paper Check 21 Controlling Image Quality At The Point of Capture”, dated 2004, 7 pgs.
Sprint PCS Vision Picture Phone, PM-8920 by Audiovox, User's Manual, Part 1, 2004, 103 pgs.
Stellin, Susan, “Bank Will Allow Customers to Direct Deposit by iPhone”, the New York Times article dated Aug. 9, 2009, obtained from the Internet at: www.nytimes.com/2009/08/10/technology/10check.html, 3 pgs.
Sumits, Major Mobile Milestones—The Last 15 Years, and the Next Five, Cisco Blogs, Feb. 3, 2016, located on the Internet at: https://blogs.cisco.com/sp/mobile-vni-major-mobile-milestones-the-last-15-years-and-the-next-five, 12 pgs.
Teixeira, D., “Comment: Time to Overhaul Deposit Processing Systems,” American Banker, Dec. 10, 1998, vol. 163, No. 235, p. 15 (3 pgs).
Thailandguru.com: How and where to Pay Bills @ www.thailandguru.com/paying-bills.html, © 1999-2007 (2 pgs).
The Automated Clearinghouse, “Retail Payment Systems; Payment Instruments Clearing and Settlement: The Automated Clearinghouse (ACH)”, www.ffiec.gov/ffiecinfobase/booklets/retailretail_02d.html, Cited in U.S. Pat. No. 7,900,822, as dated Dec. 2005 (3 pgs).
The Green Sheet 2.0: Newswire, “CO-OP adds home deposit capabilities to suite of check imaging products”, www.greensheet.com/newswire.php?newswire_id=8799, Mar. 5, 2008 (2 pgs).
Timothy R. Crews list of Patents, printed from the United States Patent and Trademark Office on Feb. 13, 2019, 7 pgs.
Tochip, E. et al., “Camera Phone Color Appearance Utility—Finding a Way to Identify Camera Phone Picture Color”, 25 pages, 2007.
Tygar, J.D., Atomicity in Electronic Commerce, In ACM Networker, 2:2, Apr./May 1998 (12 pgs).
U.S. Appl. No. 61/022,279, dated Jan. 18, 2008, (cited in IPR2020-00090, U.S. Pat. No. 9,177,197), 35 pgs.
USAA's Objections to Magistrate Judge Payne's Claim Construction Memorandum Opinion and Order, filed in Civil Action No. 2:18-CV-245, dated Jun. 27, 2019, 6 pgs.
USAA's Objections To Magistrate Judge Payne's Claim Construction Memorandum Opinion and Order, United Services Automobile Association v. Wells Fargo Bank, N.A., Civil Action No. 2:18-cv-366, dated Aug. 12, 2019, 10 pgs.
USAA's Opening Claim Construction Brief, filed in Civil Action No. 2:18-CV-366, dated May 17, 2019, 32 pgs.
USAA's Opening Claim Construction Brief, United Services Automobile Association v. Wells Fargo Bank, N.A., Civil Action No. 2:18-cv-245, dated Apr. 11, 2019, 32 pgs.
USAA's Reply Claim Construction Brief, United Services Automobile Association v. Wells Fargo Bank, N.A., Civil Action No. 2:18-cv-245, dated May 2, 2019, 227 pgs.
USAA's Reply to Claim Construction Brief, United Services Automobile Association v. Wells Fargo Bank, N.A., Case No. 2:18-cv-245, dated May 2, 2019, 15 pgs.
USAA's Reply Claim Construction Brief, filed in Civil Action No. 2:18-CV-366, dated Jun. 7, 2019, 14 pgs.
Valentine, Lisa, “Remote Deposit Capture: Hot Just Got Hotter,” ABA Banking Journal, Mar. 2006, p. 1-9.
Van Dyke, Jim, “2017 Mitek Mobile Deposit Benchmark Report”, copyright 2017, 25 pgs.
Vaream, Craig, “Image Deposit Solutions: Emerging Solutions for More Efficient Check Processing,” JP Morgan Chase, Nov. 2005 (16 pgs).
Wade, Will, “Early Debate on Remote-Capture Risk,” American Banker, New York, NY, May 26, 2004 (3 pgs).
Wade, Will, “Early Notes: Updating Consumers on Check 21” American Banker Aug. 10, 2004 (3 pgs).
Wallison, Peter J., “Wal-Mart Case Exposes Flaws in Banking-Commerce Split”, American Banker, vol. 167. No. 8, Jan. 11, 2002 (3 pgs).
Wang, Ching-Lin et al., “Chinese document image retrieval system based on proportion of black pixel area in a character image”, the 6th International Conference on Advanced Communication Technology, vol. 1, IEEE, 2004.
Wausau Financial Systems, Understanding Image Quality & Usability Within a New Environment, 2006, 22 pgs.
Wausau, “Understanding Image Quality & Usability Within a New Environment”, copyright 2019, 1 pg.
Wells Fargo 2005 News Releases, “The New Wells Fargo Electronic Deposit Services Break Through Banking Boundaries In The Age of Check 21”, San Francisco Mar. 28, 2005, www.wellsfargo.com/press/3282005_check21Year=2005 (1 pg).
Wells Fargo Commercial, “Remote Deposit”, www.wellsfargo.com/com/treasury_mgmt/receivables/electronic/remotedeposit, Copyright 2008 (1 pg).
Wells Fargo's Objections to Magistrate Judge Payne's Claim Construction Memorandum and Order, filed in Civil Action No. 2:18-CV-245, dated Jun. 27, 2019, 7 pgs.
Wells Fargo's Objections To Magistrate Judge Payne's Claim Construction Memorandum Opinion and Order, United Services Automobile Association v. Wells Fargo Bank, N.A., Civil Action No. 2:18-cv-366, dated Aug. 12, 2019, 7 pgs.
White, J.M. et al., “Image Thresholding for Optical Character Recognition and Other Applications Requiring Character Image Extraction”, IBM J. Res. Development, Jul. 1983, vol. 27, No. 4 (12 pgs).
Whitney, Steve et al., “A Framework For Exchanging Image Returns”, dated Jul. 2001, 129 pgs.
Wikipedia ®, “Remote Deposit,” http://en.wikipedia.org/wiki/Remote_deposit, 2007 (3 pgs).
Windowsfordevices.com, “Software lets camera phone users deposit checks, pay bills”, www.windowsfordevices.com/news/NS3934956670.html, Jan. 29, 2008 (3 pgs).
Wolfe, Daniel, “Check Image Group Outlines Agenda,” American Banker, New York, N.Y.: Feb. 13, 2009, vol. 174, Iss. 30, p. 12. (2 pgs).
Woody Baird Associated Press, “Pastor's Wife got Scammed—She Apparently Fell for Overseas Money Scheme,” The Commercial Appeal, Jul. 1, 2006, p. A. 1.
X9.100-180, “The New ICL Standard is Published”, All My Papers, 2006, 3 pgs.
X9.37 Specifications | X9Ware LLC, dated 2018, 3 pgs.
Yeo, L.H. et al., “Submission of transaction from mobile workstations in a cooperative multidatabase environment”, IEEE, 1994, (cited in IPR2020-00097, U.S. Pat. No. 7,885,880), 10 pgs.
Zandifar, A. et al., “A Video Based Interface To Textual Information For The Visually Impaired”, IEEE 17th International Symposium on Personal, Indoor and Mobile Radio Communications, 1-5, 2002, 6 pgs.
Zandifar, A., “A Video-Based Framework for the Analysis of Presentations/Posters,” International Journal on Document Analysis and Recognition, Feb. 2, 2005, 10 pages.
Zaw, Kyi Pyar and Zin Mar Kyu, “Character Extraction and Recognition for Myanmar Script Signboard Images using Block based Pixel Count and Chain Codes”, 2018 IEEE/ACIS 17th International Conference on Computer and Information Science (ICIS), IEEE, 2018.
Zhang, C.Y., “Robust Estimation and Image Combining” Astronomical Data Analysis Software and Systems IV, ASP Conference Series, 1995 (5 pgs).
Zions Bancorporation, “Moneytech, the technology of money in our world: Remote Deposit,” http://www.bankjunior.com/pground/moneytech/remote_deposit.jsp, 2007 (2 pgs).
Bruno-Britz, Maria “Mitek Launches Mobile Phone Check Capture Solution,” Bank Systems and Technologies Information Week (Jan. 24, 2008).
V User Guide, https://www.lg.com/us/support/manualsdocuments?customerModelCode=%20LGVX9800&csSalesCode=LGVX9800, select “VERIZON (USA) en”; The V_UG_051125.pdf.
MING Phone User Manual, 2006.
Patel, Kunur, “How Mobile Technology is Changing Banking's Future” AdAge, Sep. 21, 2009, 4 pages.
Spencer, Harvey, “Controlling Image Quality at the Point of Capture” Check 21, Digital Check Corporation & HSA 2004.
Moseik, Celeste K., “Customer Adoption of Online Restaurant Services: A Multi-Channel Approach”, Order No. 1444649 University of Delaware, 2007, Ann Arbor: ProQuest., Web. Jan. 10, 2022 (Year: 2007).
Bieniecki, Wojciech et al., “Image Preprocessing for Improving OCR Accuracy”, Computer Engineering Department, Technical University of Lodz, al. Politechniki 11, Lodz, Poland.
Shaikh, Aijaz Ahmed et al., “Auto Teller Machine (ATM) Fraud—Case Study of Commercial Bank in Pakistan”, Department of Business Administration, Sukkur Institute of Business Administration, Sukkur, Pakistan.
Tiwari, Rajnish et al., “Mobile Banking as Business Strategy”, IEEE Xplore, Jul. 2006.
Lyn C. Thomas, “A survey of credit and behavioural scoring: forecasting financial risk of lending to consumers”, International Journal of Forecasting, (Risk) (2000).
Non-Final Office Action issued on U.S. Appl. No. 14/293,159 dated Aug. 11, 2022.
Non-Final Office Action issued on U.S. Appl. No. 16/455,024 dated Sep. 7, 2022.
Non-Final Office Action issued on U.S. Appl. No. 17/071,678 dated Sep. 14, 2022.
Non-Final Office Action issued on U.S. Appl. No. 17/180,075 dated Oct. 4, 2022.
Non-Final Office Action issued on U.S. Appl. No. 17/511,822 dated Sep. 16, 2022.
Non-Final Office Action issued on U.S. Appl. No. 17/568,849 dated Oct. 4, 2022.
Yong Gu Ji et al., “A Usability Checklist for the Usability Evaluation of Mobile Phone User Interface”, International Journal of Human-Computer Interaction, 20(3), 207-231 (2006).
Printout of news article dated Feb. 13, 2008, announcing a Nokia phone providing audio cues for capturing a document image.
IPR Petition 2022-01593, Truist Bank v. United Services Automobile Association filed Oct. 11, 2022.
Final Written Decision, IPR2021-01070, dated Jan. 19, 2023.
Final Written Decision, IPR2021-01073, dated Jan. 19, 2023.
Publication of U.S. Appl. No. 60/727,533, filed Oct. 17, 2005.
ANS X9.100-160-1-2004, Part 1: Placement and Location of Magnetic Ink Printing (MICR), American National Standard for Financial Services, approved Oct. 15, 2004.
Clancy, Heather, “Turning cellphones into scanners”, The New York Times, Feb. 12, 2005: https://www.nytimes.com/2005/02/12/business/worldbusiness/turning-cellphones-into-scanners.html.
Consumer Guide to Check 21 and Substitute Checks, The Federal Reserve Board, The Wayback Machine—Oct. 28, 2004: https://web.archive.org/web/20041102233724/http://www.federalreserve.gov.
Curtin, Denis P., A Short Course in Digital Photography Chapter 7 Graphic File Formats.
Dance, Christopher, “Mobile Document Imaging”, Xerox, Research Centre Europe, XRCE Image Processing Area, Nov. 2004.
Digital Photography Now, Nokia N73 Review, Oct. 28, 2006.
Federal Reserve System, 12 CFR Part 229, Regulation CC: Docket No. R-1176, Availability of Funds and Collection of Checks, Board of Governors of the Federal Reserve System Final rule.
Financial Services Policy Committee, Federal Reserve Banks Plan Black-and-White Image Standard and Quality Check, May 18, 2004.
MICR-Line Issues Associated With The Check 21 Act and the Board's Proposed Rule, Prepared by Federal Reserve Board Staff, Apr. 27, 2004.
Microsoft Computer Dictionary Fifth Edition—Copyright 2002.
HTTP Over TLS, Network Working Group, May 2000, Memo.
Nokia N73—Full phone specifications.
Ranjan, Amit, “Using a Camera with Windows Mobile 5”, Jul. 21, 2006.
Reed, John, “FT.com site: Mobile users branch out”, ProQuest, Trade Journal, Oct. 6, 2005.
Weiqui Luo et al., “Robust Detection of Region-Duplication Forgery in Digital Image” Guoping Qui, School of Computer Science, University of Nottingham, NG8 1BB, UK—Jan. 2006.
Final Written Decision relating to U.S. Pat. No. 8,699,779, IPR2021-01070, dated Jan. 19, 2023.
Final Written Decision relating to U.S. Pat. No. 8,977,571, IPR2021-01073, dated Jan. 19, 2023.
Final Written Decision relating to U.S. Pat. No. 10,621,559, IPR2021-01077, dated Jan. 20, 2023.
IPR2023-00829 filed Apr. 13, 2023, Truist Bank v. United Services Automobile Association, 97 pgs.
“Reserve Banks to Adopt DSTU X9.37-2003 Format for Check 21 Image Services,” Fed. Reserve Bank of Minneapolis Fin. Serv. Policy Comm., May 18, 2004.
“Camera Phone Shoot-Out”, Phone Scoop, Dec. 18, 2002.
Shirai, K. et al., “Removal of Background Patterns and Signatures for Magnetic Ink Character Recognition of Checks,” 2012 10th IAPR International Workshop on Document Analysis Systems, Gold Coast, QLD, Australia, 2012, pp. 190-194.
Ding, Y. et al., “Background removal for check processing using morphology in Two-and Three-Dimensional Vision Systems for Inspection, Control, and Metrology”. vol. 5606, pp. 19-26, SPIE 2004.
Haskell, B.G. et al., “Image and video coding-emerging standards and beyond,” in IEEE Transactions on Circuits and Systems for Video Technology, vol. 8, no. 7, pp. 814-837, Nov. 1998.