Character count determination for a digital image

Information

  • Patent Grant
  • Patent Number
    11,281,903
  • Date Filed
    Monday, December 9, 2019
  • Date Issued
    Tuesday, March 22, 2022
Abstract
An image processing system or electronic device may implement processing circuitry. The processing circuitry may receive an image, such as a financial document image. The processing circuitry may determine a character count for the financial document image or particular portions of the financial document image without recognizing any particular character in the financial document image. In that regard, the processing circuitry may determine a top left score for pixels in the financial document image, the top left score indicating or representing a likelihood that a particular pixel corresponds to a top left corner of a text character. The processing circuitry may also determine a top right score for image pixels. Then, the processing circuitry may identify one or more text chunks using the top left and top right scores for pixels in the financial document image. The processing circuitry may determine a character count for the identified text chunks.
Description
BACKGROUND
1. Technical Field

This disclosure relates to determining a character count for an image, such as a financial document image. This disclosure also relates to determining the character count of the image without recognizing the identity of the characters in the image.


2. Description of Related Art

Systems may receive digital images for processing. As one example, an electronic device may capture an image of a financial document, such as a check. The user can submit the image of the check to a financial institution server for processing and deposit of the check. However, the check image may be degraded in multiple ways. The check may be overly cropped by the user such that important fields or portions of the check are cropped out of the check image. The image capture device of the electronic device may capture a blurry image of the check. These degradations may inhibit subsequent processing of the check image.


BRIEF SUMMARY

The descriptions below include methods, systems, logic, and devices for processing an image and determining a number of characters in the image without recognizing or attempting to recognize the actual text or characters in the image. In one aspect, a method is performed by circuitry, such as a processor, in an electronic device. The method performed by the circuitry includes receiving a financial document image and identifying a text chunk in the financial document image by determining a first pixel of the financial document image as the top left pixel of the text chunk based on a top left score of the first pixel and determining a second pixel of the financial document image as the top right pixel of the text chunk based on a top right score of the second pixel. The method further includes determining a character count for the text chunk without recognizing any particular character in the text chunk.


In another aspect, a system includes a memory and a processor. The processor is operable to receive a financial document image and identify a text chunk in the financial document image by determining a first pixel of the financial document image as the top left pixel of the text chunk based on a top left score of the first pixel and determining a second pixel of the financial document image as the top right pixel of the text chunk based on a top right score of the second pixel. The processor is also operable to determine a chunk extension for the text chunk and add the chunk extension to the text chunk. After adding the chunk extension to the text chunk, the processor is operable to determine a character count for the text chunk without recognizing any particular character in the text chunk.


In another aspect, a non-transitory computer readable medium includes instructions that, when executed by a processor, cause the processor to receive a financial document image; identify an interest region in the financial document image; and identify a text chunk in the interest region of the financial document image. The instructions cause the processor to identify the text chunk by determining a first pixel of the financial document image as the top left pixel of the text chunk based on a top left score of the first pixel and determining a second pixel of the financial document image as the top right pixel of the text chunk based on a top right score of the second pixel. The instructions also cause the processor to determine a character count for the text chunk in the interest region of the financial document image without recognizing any particular character in the text chunk; determine a character count for the interest region by summing the character count for the text chunk with a character count of any additional text chunks in the interest region; and determine whether the character count for the interest region exceeds a minimum character count threshold specifically for the interest region.





BRIEF DESCRIPTION OF THE DRAWINGS

The innovation may be better understood with reference to the following drawings and description. In the figures, like reference numerals designate corresponding parts throughout the different views.



FIG. 1 shows an example of a system for determining a character count of an image.



FIG. 2 shows an example of a financial document image.



FIG. 3 shows an example of a top left scoring grid for an image pixel.



FIG. 4 shows an example of a top left score array.



FIG. 5 shows an example of a top right scoring grid for an image pixel.



FIG. 6 shows an exemplary flow for determining a top right pixel of a text chunk.



FIG. 7 shows examples of text chunks the processing circuitry may determine.



FIG. 8 shows an example of logic for identifying one or more text chunks in an image.



FIG. 9 shows an exemplary flow for counting characters in a text chunk.



FIG. 10 shows an example of logic for obtaining a character count for a text chunk.



FIG. 11 shows an example of logic that may be implemented in hardware, software, or both.





DETAILED DESCRIPTION


FIG. 1 shows an example of a system 100 for determining a character count of an image. The system 100 in FIG. 1 includes an electronic device 102 communicatively linked to an image processing system 104 through a communication network 106. The electronic device 102 may include or communicate with an image capture device that captures digital images, such as a digital camera or scanning device. In FIG. 1, the electronic device 102 is a mobile communication device, e.g., a cellular telephone with a digital camera. However, the electronic device 102 may take any number of forms, including as examples a laptop, digital camera, personal digital assistant (PDA), tablet device, portable music player, desktop computer, any image scanning device, or others. The electronic device 102 may also include any number of communication interfaces supporting communication through the communication network 106.


The communication network 106 may include any number of networks for communicating data. In that regard, the communication network 106 may include intermediate network devices or logic operating according to any number of communication mediums, protocols, topologies, or standards. As examples, the communication network 106 may communicate across any of the following mediums, protocols, topologies and standards: Ethernet, cable, DSL, Multimedia over Coax Alliance, power line, Ethernet Passive Optical Network (EPON), Gigabit Passive Optical Network (GPON), any number of cellular standards (e.g., 2G, 3G, Universal Mobile Telecommunications System (UMTS), GSM® Association, Long Term Evolution (LTE)™, or more), WiFi (including 802.11 a/b/g/n/ac), WiMAX, Bluetooth, WiGig, and more.


The exemplary system 100 shown in FIG. 1 also includes the image processing system 104. As described in greater detail below, the image processing system 104 may determine a character count for an image received by the image processing system 104. The image processing system 104 may determine the character count for the image without recognizing the content or identity of characters in the image.


In some implementations, the image processing system 104 may include a communication interface 110, processing circuitry 112, and a user interface 114. The processing circuitry 112 of the image processing system 104 may perform any functionality associated with the image processing system 104, including any combination of the image processing techniques and methods described below. In one implementation, the processing circuitry 112 includes one or more processors 116 and a memory 120. The memory 120 may store image processing instructions 122 and character count parameters 124. The character count parameters 124 may include any parameters, settings, configurations, or criteria that control how the processing circuitry 112 processes an image, including determining a character count for the image. In some variations, the electronic device 102, such as a mobile device, may additionally or alternatively implement any of the functionality of the processing circuitry 112 described herein.


The processing circuitry 112 may process any digital image that may include text. As examples, the processing circuitry 112 may process an image of any type of financial document, including negotiable instruments such as personal checks, business checks, money orders, promissory notes, certificates of deposit, and more. As additional examples, the processing circuitry 112 may process images of any other type, such as an image of any type of insurance form or document, tax documents (e.g., form 1040), employment forms, savings bonds, traveler's checks, job applications, any type of bill, such as an automotive repair bill or medical bill, a remittance coupon, and images of many more types.



FIG. 2 shows an example of a financial document image 200 that the processing circuitry 112 may determine a character count for. A financial document image may refer to an image that includes any portion of a financial document. In particular, the example in FIG. 2 shows an image of a check, though the financial document image 200 may take various forms. The processing circuitry 112 may receive the financial document image 200 and apply any number of image processing functions prior to performing a character count process for the financial document image. For instance, the processing circuitry 112 may determine one or more corners or edges of the financial document image 200, deskew the image 200 to remove distortions, adjust the contrast of the image 200 to ease subsequent processing, apply any number of image cleaning algorithms, de-blur the image 200, binarize the image into black and white pixels, and more. As another example, the processing circuitry 112 may adjust the size of the image 200 to a particular height or width (e.g., in pixels) or to a particular proportion of the originally received image (e.g., shrink the image 200 to ¼ of the original size). The processing circuitry 112 may resize the image such that expected text in the financial document image (e.g., text of a particular font, particular font size, particular font size range, etc.) is of a particular pixel height, width, or within a particular pixel range.


The processing circuitry 112 may convert the financial document image 200 into a pixel array for processing. To illustrate, the financial document image 200 in FIG. 2 includes the image portion labeled as 210. The exemplary image portion 210 includes a number of pixels binarized into either white pixels or black pixels, as seen in the expanded image portion 210 reproduced and expanded in FIG. 2 above the financial document image 200. The processing circuitry 112 may represent the image portion 210 as a pixel array of black and white pixels. As one example, the processing circuitry 112 may represent the financial document image 200, including the image portion 210, as a two-dimensional array of array values, where an array value of ‘0’ indicates a white pixel and an array value of ‘1’ indicates a black pixel. The processing circuitry 112 may perform any of the image processing steps described above in preparation for determining the character count for the financial document image 200.
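
As a concrete illustration of this pixel array representation, the following Python sketch binarizes an image into a two-dimensional array of 0s (white pixels) and 1s (black pixels). The use of Pillow and NumPy and the fixed intensity threshold are illustrative assumptions; the disclosure does not prescribe a particular library or binarization method.

```python
import numpy as np
from PIL import Image

def to_pixel_array(image_path, threshold=128):
    """Binarize an image into a 2D array: 0 for white pixels, 1 for black."""
    gray = Image.open(image_path).convert("L")  # grayscale intensities 0-255
    pixels = np.asarray(gray)
    # Dark pixels (intensity below the threshold) map to 1 (black).
    return (pixels < threshold).astype(np.uint8)
```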


Exemplary processes through which the processing circuitry 112 may determine or estimate a character count for the financial document image 200 are presented next. First, the processing circuitry 112 may identify one or more text chunks in the financial document image 200, for example as described through FIGS. 3-8. Next, the processing circuitry 112 may determine a character count for some or all of the identified text chunks, for example as described through FIGS. 9-10.


Identifying Text Chunks


The processing circuitry 112 may identify one or more text chunks in the financial document image 200. A text chunk may refer to a particular portion, or set of pixels, of the financial document image 200 that may contain one or more text characters. In doing so, the processing circuitry 112 may evaluate the financial document image 200 to determine the likelihood that particular portions (e.g., particular pixels) of the image 200 correspond to the boundary of a character, such as any edge, a top right corner, a top left corner, or other boundary portion of a text character.


In particular, the processing circuitry 112 may determine a likelihood that an image pixel corresponds to or is within a particular distance from the top left corner of a character. In some variations, the processing circuitry 112 may apply a top left scoring algorithm for pixels in the financial document image 200 to specify this likelihood. In scoring a particular pixel, the top left scoring algorithm may account for any number of other pixels surrounding the particular pixel. One such example is presented next in FIG. 3.



FIG. 3 shows an example of a top left scoring grid 301 for an image pixel 302. A scoring grid of pixels may represent a set of pixels surrounding a particular pixel, and the processing circuitry 112 may use the top left scoring grid 301 to apply a top left scoring algorithm to the image pixel 302. The top left scoring grid 301 may take various sizes or shapes, which may be specified through the character count parameters 124. The character count parameters 124 may, for instance, specify a height and/or width of the top left scoring grid 301. As shown in FIG. 3, the processing circuitry 112 determines the top left scoring grid 301 as a 6 by 6 array of pixels, with the image pixel 302 being evaluated by the top left scoring algorithm positioned as the top left pixel of the top left scoring grid 301.


The character count parameters 124 may specify dimensions for a scoring grid according to any number of factors, some of which are presented next. The processing circuitry 112 may resize the financial document image 200 such that expected text of the image has a particular size, e.g., such that a MICR line, courtesy amount line, or other particular text in the financial document image 200 has a particular pixel height, width, or pixel size range. The character count parameters 124 may specify, for example, dimensions for the top left scoring grid 301 such that the top left scoring grid 301 (or a non-padded portion thereof as discussed in greater detail below) covers a predetermined portion of an expected text character in the financial document image 200. As another variation, the character count parameters 124 may specify a scoring grid size that covers ⅓ the width of an expected text character and ½ the height of an expected text character, which may be specified in pixels.


In some implementations, the character count parameters 124 specify the dimensions of a scoring grid to account for a particular pixel density of the financial document image 200. For instance, the character count parameters 124 may specify a particular dimension (e.g., 6 pixels wide by 9 pixels high) for a scoring grid given a particular pixel density of the image 200 (e.g., for a 200 Dots-Per-Inch image). Additionally or alternatively, the character count parameters 124 may specify a scoring grid dimension to account for a minimum expected font size or minimum relevant font size in an image, for which the pixel size may vary depending on how the image was resized by the processing circuitry 112.


The processing circuitry 112 may determine a top left score for a pixel. The scoring algorithm may implement a scoring range indicative of the likelihood that the image pixel 302 corresponds to a top left corner of a character or is within a particular padded distance from the top left corner of a character. With regard to a padded distance, the top left scoring algorithm may include a padding parameter. The padding parameter may specify a particular padding of white pixels that surround the top left corner of a character. For example, with a padding parameter value of 2, the image pixel 302 may have an increased top left score when the image pixel 302 is two pixels above and two pixels to the left of a top left corner pixel of a text character. For top left scoring, the character count parameters 124 may specify a top padding parameter, a left padding parameter, or both.


The processing circuitry 112 may determine a top left score for the image pixel 302 according to the distribution of white and/or black pixels in the top left scoring grid 301. With padding, the scoring algorithm may indicate a higher likelihood of the image pixel 302 corresponding to a top left corner of a character when particular portions of the top left scoring grid 301 are white pixels, e.g., a white padded portion of the top left scoring grid 301. Along similar lines, the scoring algorithm may indicate a higher likelihood of a pixel corresponding to the top left corner when particular portions of the top left scoring grid 301 are black pixels, e.g., a black character portion. As one example, for a padding parameter value of 2 (for both top and left), the character count parameters 124 may specify an ideal distribution of white and black pixels in a 6×6 top left scoring grid 301 as the following configuration:

W W W W W W
W W W W W W
W W B B B B
W W B B B B
W W B B B B
W W B B B B

The ideally white (W) pixels in the above ideal configuration may form the white padded portion for determining a top left score and the ideally black pixels (B) may form the black character portion for determining a top left score. When determining a top left score for a pixel, the processing circuitry 112 may determine the proportion of the white padded portion of the top left scoring grid 301 for that pixel that includes white pixels and the proportion of the black character portion that includes black pixels.


The processing circuitry 112 may apply weights when evaluating the pixels in the top left scoring grid 301. That is, the processing circuitry 112 may give more or less weight when a particular pixel in a particular position in the top left scoring grid 301 is either white or black. For instance, the character count parameters 124 may specify greater weight for pixels that are closer to a particular pixel, edge, or region in the top left scoring grid 301. One exemplary weighting for a 6×6 top left scoring grid 301 with a padding parameter value of 2 is as follows (black character portion weights underlined):

1   1   1   1   1   1
1   2   2   2   2   2
1   2   _4_ _4_ _4_ _4_
1   2   _4_ _3_ _3_ _3_
1   2   _4_ _3_ _2_ _2_
1   2   _4_ _3_ _2_ _1_

In the above weighting, white pixels in the white padded portion are given a weight (e.g., a multiplier) of 1 or 2. Black pixels in the white padded portion may be given a weight of 0. Black pixels in the black character portion are given a weight of 4, 3, 2, or 1, as shown by the underlined weights for pixels in the black character portion. White pixels in the black character portion may be given a weight of 0. In that regard, the processing circuitry 112 may determine a weighted proportion of white pixels in the white padded portion (e.g., a white padded score), for example by dividing a weighted sum for the pixels in the white padded portion by the ideal weighted value for the white padded portion. The processing circuitry 112 may determine a black character score in a consistent manner as well.
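
To make the scoring arithmetic concrete, the following Python sketch computes a white padded score, a black character score, and their equally weighted combination for the exemplary 6×6 grid and weights above. The function and constant names are illustrative assumptions, not part of the disclosure; for the grid shown in FIG. 3, this computation reproduces the 21/29 and 35/50 proportions worked through below.

```python
import numpy as np

WHITE, BLACK = 0, 1

# Exemplary weights for a 6x6 top left scoring grid with a padding
# parameter value of 2; cells outside a portion carry weight 0.
WHITE_PAD_WEIGHTS = np.array([
    [1, 1, 1, 1, 1, 1],
    [1, 2, 2, 2, 2, 2],
    [1, 2, 0, 0, 0, 0],
    [1, 2, 0, 0, 0, 0],
    [1, 2, 0, 0, 0, 0],
    [1, 2, 0, 0, 0, 0],
])
BLACK_CHAR_WEIGHTS = np.array([
    [0, 0, 0, 0, 0, 0],
    [0, 0, 0, 0, 0, 0],
    [0, 0, 4, 4, 4, 4],
    [0, 0, 4, 3, 3, 3],
    [0, 0, 4, 3, 2, 2],
    [0, 0, 4, 3, 2, 1],
])

def top_left_score(grid):
    """Score a 6x6 grid whose top left cell is the pixel being evaluated."""
    grid = np.asarray(grid)
    # Weighted proportion of ideally white cells that are actually white.
    white_score = (WHITE_PAD_WEIGHTS * (grid == WHITE)).sum() / WHITE_PAD_WEIGHTS.sum()
    # Weighted proportion of ideally black cells that are actually black.
    black_score = (BLACK_CHAR_WEIGHTS * (grid == BLACK)).sum() / BLACK_CHAR_WEIGHTS.sum()
    # Equal 0.5/0.5 weighting of the two scores, as in the FIG. 3 example.
    return 0.5 * white_score + 0.5 * black_score
```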


To illustrate, the processing circuitry 112 may determine the top left score for the image pixel 302 with the particular top left scoring grid 301 depicted in FIG. 3 according to a padding value of 2 and the exemplary weights shown above. The processing circuitry 112 may calculate a white padded portion score and black character score for the top left scoring grid 301. In particular, the white padded portion of the top left scoring grid 301 shown in FIG. 3 has a pixel array of (cells marked “-” lie outside the white padded portion):

W  W  W  W  W  W
W  W  W  W  W  W
W  B  -  -  -  -
W  B  -  -  -  -
W  B  -  -  -  -
W  B  -  -  -  -

Applying the weights for white and black pixels in the white padded portion, the processing circuitry 112 may determine the following weighted values for the white padded portion:

1  1  1  1  1  1
1  2  2  2  2  2
1  0  -  -  -  -
1  0  -  -  -  -
1  0  -  -  -  -
1  0  -  -  -  -

Summing the weighted values, the processing circuitry 112 may determine the weighted sum for the white padded portion as 21. The processing circuitry 112 may identify the ideal weighted value (e.g., when all the pixels in the white padded portion are white) as 29. Accordingly, the processing circuitry 112 may determine the white padded score of the top left scoring grid 301 shown in FIG. 3 as 21/29=0.72.


The processing circuitry 112 may similarly determine a black character score of the top left scoring grid 301. The processing circuitry 112 may apply the exemplary weighting shown above for the black character portion of the top left scoring grid 301 in FIG. 3. In doing so, the processing circuitry 112 may determine the following weighted values for the black character portion (cells marked “-” lie outside the black character portion):

-  -  -  -  -  -
-  -  -  -  -  -
-  -  4  0  0  0
-  -  4  3  0  3
-  -  4  3  2  2
-  -  4  3  2  1

In this example, the processing circuitry 112 may determine the weighted sum for the black character portion as 35 and the ideal weighted sum (e.g., when all the pixels in the black character portion are black) as 50. Accordingly, the processing circuitry 112 may, in one implementation, determine the weighted black character score of the top left scoring grid 301 shown in FIG. 3 as 35/50=0.70.


The processing circuitry 112 may additionally apply weights when accounting for the white padded score and the black character score. When weighted equally, the processing circuitry 112 may determine the top left score of the image pixel 302 as the average of the white padded score and the black character score. In this example, the processing circuitry 112 determines the top left score of the image pixel 302 as (0.5)*0.72+(0.5)*0.70=0.71, as shown in FIG. 3. In other variations, the processing circuitry 112 may apply a greater weight to the white padded score than the black character score or vice versa.


The processing circuitry 112 may determine the top left score for the image pixel 302 as well as for any number of other pixels in the financial document image 200. The top left score determination method above may provide a quick and efficient method for determining the likelihood a particular pixel corresponds to the top left corner of a text character. The processing circuitry 112 may determine a respective top left score for pixels and identify pixels with a greater likelihood of corresponding to the top left corner of a character without, for example, performing edge detection processes or other processing-intensive processes.


While some particular examples have been presented above, the character count parameters 124 may specify any number of different configurations for determining a top left score, including varying height and width dimensions for the top left scoring grid 301, varying padding parameter values (including top padding, left padding, or both), as well as varying weight configurations, such as weights applied to particular pixels in the white padded portion or the black character portions, or to the white padded and black character scores. Another weighting configuration for a 6×9 top left scoring grid 301 with a top and left padding of 3 may be as follows (with black character portion weights underlined):

1  1  1  1   1   1
1  2  2  2   2   2
1  2  3  3   3   3
1  2  3  _3_ _2_ _1_
1  2  3  _2_ _2_ _1_
1  2  3  _1_ _1_ _1_
1  2  3  _1_ _1_ _1_
1  2  3  _1_ _1_ _1_
1  2  3  _1_ _1_ _1_

In this example, the white padded portion is weighted along the top and left edges of a character and the black character portion is weighted to emphasize the top left pixel of the character. The character count parameters 124 may implement any number of varying configurations through which the processing circuitry 112 determines the top left score for pixels in the financial document image 200.



FIG. 4 shows an example of a top left score array 400. In particular, the top left score array 400 shown in FIG. 4 includes top left scores for a row of pixels in the financial document image 200, e.g., after the processing circuitry 112 has determined the top left scores for that particular row of pixels. Along similar lines, the processing circuitry 112 may determine the top left scores for the remaining pixels of the financial document image 200 as well. In the exemplary top left score array 400 shown in FIG. 4, the processing circuitry 112 may determine the respective top left score for pixels in the financial document image 200 according to the following parameters: using a 6×9 pixel dimension for a top left scoring grid, a padding parameter value of 2, and an equal weight (e.g., half or 0.5) for the white padded score and black character score.


The processing circuitry 112 may determine a respective top left score for some or all of the pixels in the financial document image 200. For example, the processing circuitry 112 may abstain from or forego determining the top left score for a pixel when the pixel is in a particular region of the financial document image 200, e.g., in the bottom-most row of the image 200, within a predetermined number of rows from the bottom-most row, in the right-most column of the image 200, or within a predetermined number of columns from the right-most column. As another example, the processing circuitry 112 may selectively determine the top left scores for pixels within a predetermined pixel distance from an interest region of the financial document image 200, such as the Magnetic Ink Character Recognition (MICR) location of a check, from a particular form field of an insurance document, and the like.


Before, during, or after determining the top left score for pixels in the financial document image 200, the processing circuitry 112 may determine a top right score for one or more pixels in the financial document image 200. In that regard, the processing circuitry 112 may determine a likelihood that an image pixel corresponds to or is within a particular pixel distance from the top right corner of a text character. The processing circuitry 112 may apply a scoring algorithm similar in many respects to the top left scoring algorithm described above, but with any number of variances. For example, the configuration, weights, and other parameters specified by the character count parameters 124 for determining the top right score may be vertically mirrored from those used for determining a top left score.


The character count parameters 124 may specify distinct parameters through which the processing circuitry 112 determines a top right score for a pixel. In that regard, the character count parameters 124 may specify different configurations for a top right scoring grid as compared to the top left scoring grid, including differences in scoring grid dimensions, weights applied to pixels within a top right scoring grid, etc. In particular, the processing circuitry 112 may use a top padding parameter value and/or right padding parameter value in determining the top right score for a pixel, but not a left padding parameter value (as compared to the top left score determination parameters, which may include a left padding parameter value but not a right padding parameter value).



FIG. 5 shows an example of a top right scoring grid 501 for an image pixel 502. The top right scoring grid 501 in FIG. 5 has a height and width of 6 pixels and a padding parameter value of 2. In this example, the character count parameters 124 may specify an ideal distribution of white and black pixels in the 6×6 top right scoring grid 501 as the following configuration:

W W W W W W
W W W W W W
B B B B W W
B B B B W W
B B B B W W
B B B B W W

The ideally white (W) pixels in the above ideal configuration may form the white padded portion for determining the top right score for a particular pixel, e.g., the top right pixel of the top right scoring grid 501. The ideally black (B) pixels in the above ideal configuration may form the black character portion for determining the top right score. One exemplary weighting for a 6×6 top right scoring grid 501 with a padding value of 2 is as follows (with black character weights underlined):

1   1   1   1   1  1
2   2   2   2   2  1
_4_ _4_ _4_ _4_ 2  1
_3_ _3_ _3_ _4_ 2  1
_2_ _2_ _3_ _4_ 2  1
_1_ _2_ _3_ _4_ 2  1

As seen, this exemplary weighting for a 6×6 top right scoring grid 501 is vertically mirrored from the exemplary weighting for a 6×6 top left scoring grid 301 discussed above.


The processing circuitry 112 may apply the weights to the top right scoring grid 501 for the image pixel 502 specifically shown in FIG. 5, which may be represented by the following pixel array (pixels in the black character portion underlined):

W   W   W   W   W  W
W   W   W   W   W  W
_W_ _W_ _B_ _B_ W  W
_B_ _W_ _B_ _B_ W  W
_B_ _B_ _B_ _B_ W  W
_B_ _B_ _B_ _B_ W  W

The processing circuitry 112 may determine the weighted sum of the white padded portion to be 29 and the ideal weighted sum to be 29. In this example, the processing circuitry 112 determines the white padded score as 29/29=1.0. Following the same calculation for the black character portion, the processing circuitry 112 may determine the black character score as 0.70. The processing circuitry 112 may, for example, apply the same weight to each score and determine the top right score of the image pixel 502 as (0.5)*1.0+(0.5)*(0.70)=0.85, as shown in FIG. 5. In a similar way, the processing circuitry 112 may determine a respective top right score for some or all of the pixels in the financial document image 200.


The processing circuitry 112 may determine top right scores and top left scores for some or all of the pixels in the financial document image 200. For a particular pixel, the processing circuitry 112 may determine a top right score for the particular pixel, a top left score for the particular pixel, or both. After determining top right and top left scores for pixels of the financial document image 200, the processing circuitry 112 may use the determined top right scores and top left scores to identify text chunks in the financial document image 200.


The processing circuitry 112 may identify a text chunk by determining one or more boundary pixels or edges for the text chunk. As one exemplary process described in greater detail below, the processing circuitry 112 may determine a top left pixel of the text chunk, a top right pixel of the text chunk, and a bottom edge of the text chunk. In that regard, the processing circuitry 112 may sequentially consider pixels in the financial document image 200 to identify a boundary of a text chunk. For example, the processing circuitry 112 may start the sequential processing of pixels for the text chunk determination process at the top left pixel of the financial document image 200. Or, the processing circuitry 112 may start with a pixel that belongs to a particular portion of the financial document image 200, e.g., a MICR line portion of a check image.


The processing circuitry 112 may identify pixels or boundaries for a text chunk according to any number of chunk boundary criteria, which may be specified by the character count parameters 124. For a current pixel being considered for text chunk identification, the processing circuitry 112 may first determine whether the current pixel is already part of a previously determined text chunk. If so, the processing circuitry 112 may exclude the current pixel from belonging to another text chunk and proceed to a subsequent pixel for consideration.


When a current pixel is not part of a previously determined text chunk, the processing circuitry 112 may determine whether the current pixel meets chunk boundary criteria for a top left pixel of the text chunk. The processing circuitry 112 may identify the current pixel as the top left pixel for a text chunk when the top left score of the current pixel is equal to or exceeds a top left score threshold, such as a top left score threshold of 0.65 in some implementations. Accordingly, the processing circuitry 112 may identify a top left pixel for the text chunk without performing additional or more complicated image processing techniques, e.g., without performing edge detection algorithms. After determining a top left pixel for a text chunk, the processing circuitry 112 may determine the top right pixel for the text chunk.


The processing circuitry 112 may determine a top right pixel for the text chunk by evaluating pixels to the right of the determined top left pixel of the text chunk. In that regard, the processing circuitry 112 may determine a set of potential top right pixels based on the top left scores of the pixels being evaluated, e.g., top right candidate pixels. In one implementation, the processing circuitry 112 determines the top right candidate pixels for the text chunk as a set of consecutive pixels with a top left score below a top left score threshold, as set by the character count parameters 124. The character count parameters 124 may specify the same or a different top left score threshold used for identifying the top left pixel of the text chunk and the top right candidate pixels of the text chunk.


To illustrate, the processing circuitry 112 may start at the determined top left pixel of the text chunk and sequentially consider pixels to the right of the top left pixel. When the current pixel has a top left score less than the top left score threshold for identifying a top right pixel (e.g., 0.65), the processing circuitry 112 may increment a counter value and continue to the next pixel. When the current pixel has a top left score equal to or greater than the top left score threshold for identifying a top right pixel (e.g., 0.65), the processing circuitry 112 may reset the counter to 0 and continue to the next pixel. When the counter value reaches a counter threshold value (e.g., 13), the processing circuitry 112 may identify a number of previously considered pixels equal to the counter threshold value as the top right candidate pixels (e.g., the 13 previously considered pixels when the counter threshold value is 13). An exemplary iteration of this process is presented in FIG. 6.



FIG. 6 shows an exemplary flow 600 for determining a top right pixel of a text chunk. In the exemplary flow 600 in FIG. 6, the processing circuitry 112 reads the character count parameters 124, which may specify a top left score threshold for identifying the top right pixel as 0.65 and the counter threshold value as 13. As the processing circuitry 112 evaluates successive pixels, the processing circuitry 112 either increments a counter value when the top left score of the pixel is less than 0.65 or resets the counter value when the top left score of the pixel is equal to or greater than 0.65. In this example, when the processing circuitry 112 identifies thirteen (13) consecutive pixels with a top left score less than 0.65, the processing circuitry 112 determines the top right candidate pixels 610 for the text chunk.


The processing circuitry 112 may determine the top right pixel for the text chunk from among the top right candidate pixels 610. In some implementations, the processing circuitry 112 identifies the pixel with the highest top right score from among the top right candidate pixels 610 as the top right pixel for the text chunk. In the example shown in FIG. 6, the processing circuitry 112 determines the pixel with the top right score of 0.83 as the top right pixel 620 for the text chunk. The processing circuitry 112 may identify a top right pixel for the text chunk without performing additional or more complicated image processing techniques, e.g., without performing edge detection algorithms.
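
A minimal Python sketch of this candidate search follows, assuming per-pixel top left and top right score arrays computed as described above. The 0.65 threshold and run length of 13 mirror the FIG. 6 example; the function and parameter names are illustrative.

```python
def find_top_right(top_left_scores, top_right_scores, row, start_col,
                   threshold=0.65, run_length=13):
    """Scan right from a chunk's top left pixel to find its top right pixel."""
    counter = 0
    for col in range(start_col + 1, len(top_left_scores[row])):
        if top_left_scores[row][col] < threshold:
            counter += 1   # one more consecutive low top left score
        else:
            counter = 0    # a likely character corner resets the run
        if counter == run_length:
            # The run of low-scoring pixels forms the top right candidates;
            # choose the candidate with the highest top right score.
            candidates = range(col - run_length + 1, col + 1)
            return max(candidates, key=lambda c: top_right_scores[row][c])
    return None  # the row ended before a candidate run was found
```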


After determining a top left and top right corner for a text chunk, the processing circuitry 112 may determine a bottom edge of the text chunk. In doing so, the processing circuitry 112 may consider rows of pixels below the particular row of pixels formed by and between the top left and top right pixels, e.g., the top row of the text chunk. The processing circuitry 112 may identify a first row of pixels below the top row of the text chunk formed by the top left and top right pixels with a proportion of white pixels that exceeds a bottom edge threshold. In some variations, the character count parameters 124 may set the bottom edge threshold at 90%, for example. The processing circuitry 112 may determine the bottom edge of the text chunk as the upper edge of the identified first row of pixels with a proportion of white pixels that exceeds the bottom edge threshold.


In determining the bottom edge of the text chunk, the processing circuitry 112 may ignore or not consider a number of pixel rows at the top of the text chunk equal to the padding parameter value. For example, when the padding parameter value is set to 2, the processing circuitry 112 may not consider the top two rows of pixels formed between the top left and top right pixels when identifying the bottom edge of the text chunk. Put another way, the processing circuitry 112 may forego considering the top row of pixels that includes the top left and top right pixels and the next row of pixels directly below the top row when the padding parameter value is 2. Accordingly, the processing circuitry 112 may determine a text chunk formed by a top left pixel, a top right pixel, and a bottom edge. The processing circuitry 112 may further process the text chunk as well, some examples of which are shown in FIG. 7.
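
The bottom edge search might be sketched as follows, assuming a binarized image array (0 for white, 1 for black). The 90% threshold and padding value of 2 follow the example values above; the names are illustrative.

```python
def find_bottom_edge(image, top_row, left_col, right_col,
                     white_ratio=0.90, padding=2):
    """Find the first mostly-white row below a chunk's padded top rows."""
    width = right_col - left_col + 1
    # Skip a number of rows equal to the padding parameter value.
    for row in range(top_row + padding, len(image)):
        row_slice = image[row][left_col:right_col + 1]
        whites = sum(1 for px in row_slice if px == 0)
        if whites / width > white_ratio:
            return row  # the chunk's bottom edge is this row's upper edge
    return len(image) - 1  # fall back to the last row of the image
```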



FIG. 7 shows examples of text chunks 710, 720, and 730 the processing circuitry 112 may determine. The processing circuitry 112 may determine the text chunk labeled as 710 with boundaries set by the top left pixel 711, top right pixel 712, and the bottom edge 713 using any of the methods or techniques described above. In particular, in determining the text chunk 710, the character count parameters 124 may specify a padding parameter value of 2, and accordingly, the processing circuitry 112 may forego considering the first two rows of pixels in the text chunk 710 when determining the bottom edge 713. As seen in FIG. 7, the row of pixels below the bottom edge 713 is composed entirely of white pixels, and the processing circuitry 112 may identify this row as the first row of pixels whose proportion of white pixels exceeds the bottom edge threshold. Accordingly, the processing circuitry 112 may determine the bottom edge 713 of the text chunk 710 as the upper edge of this row of white pixels exceeding the bottom edge threshold.


In some variations, the processing circuitry 112 may pad the bottom edge 713 of a determined text chunk 710. In that regard, the processing circuitry 112 may add a number of rows of white pixels below the bottom edge 713 as set by the character count parameters 124. In the example shown in FIG. 7, the processing circuitry 112 determines the text chunk 720 by padding the text chunk 710 with two rows of white pixels, e.g., through setting the padded bottom edge 723 as the bottom edge of a text chunk.


In some variations, the processing circuitry 112 may adjust the left or right edges of a determined text chunk. For example, the processing circuitry 112 may pad the left edge of a text chunk, right edge of a text chunk, or both as similarly described above with regards to padding the bottom edge 713 of the text chunk 710. The processing circuitry 112 may additionally or alternatively adjust the left or right edges of a text chunk to include a chunk extension. In FIG. 7, the processing circuitry 112 may determine the text chunk 730 by adjusting the text chunk 720 to include the chunk extension 740.


The processing circuitry 112 may determine a chunk extension for a text chunk. In doing so, the processing circuitry 112 may consider the columns of pixels to the right or left of an edge of a text chunk and determine the occurrence of a threshold number of consecutive white pixel columns, e.g., a consecutive number of columns each of which and/or which collectively have a proportion of white pixels that exceeds an extension threshold, such as 90% white pixels. The columns considered by the processing circuitry 112 may be to the left or right of the text chunk and have the same height as the text chunk. The processing circuitry 112 may determine the chunk extension as the section of pixels between the edge of the text chunk and the identified consecutive white pixel columns.


As one particular example shown in FIG. 7, the processing circuitry 112 may determine the text chunk 720. Then, the processing circuitry 112 may determine the chunk extension 740 when, for example, the next threshold number of (e.g., the next 20) pixel columns to the right of the chunk extension 740 in the financial document image 200 are each 90% (or more) white. Accordingly, the processing circuitry 112 may determine the text chunk 730 by appending the chunk extension 740 to the text chunk 720. As such, the processing circuitry 112 may identify the top right pixel 732 in the chunk extension 740 as part of the boundary of the text chunk 730.
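
A sketch of the right-side extension check follows, under the same binarized-array assumption and with the 20-column, 90%-white example values; the edge bookkeeping is illustrative.

```python
def extend_right_edge(image, top, bottom, right,
                      white_ratio=0.90, run_length=20):
    """Return a new right edge that appends any chunk extension found."""
    height = bottom - top + 1
    run_start, run = None, 0
    for col in range(right + 1, len(image[0])):
        column = [image[row][col] for row in range(top, bottom + 1)]
        if column.count(0) / height >= white_ratio:
            if run == 0:
                run_start = col
            run += 1
            if run == run_length:
                # Pixels between the old edge and the white run form the
                # chunk extension; the new edge sits just before the run.
                return run_start - 1
        else:
            run = 0
    return right  # no qualifying run of white columns; edge unchanged
```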


In processing the financial document image 200, the processing circuitry 112 may determine the text chunks 710 and 720 shown in FIG. 7 as intermediate text chunks and the text chunk 730 as the determined text chunk. However, the character count parameters 124 may include any number of additional or alternative parameters for determining intermediate text chunks before obtaining a determined text chunk.



FIG. 8 shows an example of logic 800 for identifying one or more text chunks in an image. The logic 800 may be implemented in hardware, software, or both. For example, the processing circuitry 112 may implement the logic 800 in software as the image processing instructions 122.


The processing circuitry 112 may read the character count parameters 124 (802) and receive an image (804). The image may, for example, be a financial document image 200 such as a check or insurance form. The processing circuitry 112 may perform various pre-processing on the image, such as cleaning up the image, resizing the image to control the text size of expected text in the image (e.g., text of a MICR line or courtesy line in a check), binarizing the image, or converting the image to a pixel array.


The processing circuitry 112 may determine a respective top left score for some or all of the pixels in the image (806). The processing circuitry 112 may determine a respective top right score for some or all of the pixels in the image (808), including for pixels different from the pixels for which the processing circuitry 112 determines respective top left scores. In some implementations, the processing circuitry 112 may determine the top left scores and/or top right scores for specific portions (e.g., interest regions) of the image, and decline or skip determining the top left and/or top right scores for pixels outside of the interest regions of the image. As one example, the character count parameters 124 may specify a MICR line region, upper left hand corner, amount region on the middle right side, or other portions of a check image as interest regions.


Upon determining the top left and top right scores for image pixels in the image, the processing circuitry 112 may sequentially process pixels in the image. In that regard, the processing circuitry may determine whether any additional pixels in the image remain for processing (810), e.g., whether any unprocessed pixels remain in the interest region(s) of the image. If so, the processing circuitry 112 may set a current pixel (812). The processing circuitry 112 may set the current pixel as the next pixel in a pixel processing ordering. For example, the processing circuitry 112 may process pixels in order from left to right and row by row, from the top left corner of the image to the bottom right corner of the image or interest region.


For a current pixel, the processing circuitry 112 determines whether the current pixel is already part of a previously formed text chunk (816). For example, the processing circuitry 112 may access a listing of determined text chunks for the image, and determine whether the pixel is already part of another determined text chunk. If so, the processing circuitry 112 may proceed to consider a subsequent pixel of the image or interest region, if any remain (810).


The processing circuitry 112 may determine the boundaries of a text chunk. In that regard, the processing circuitry 112 may determine a top left corner for the text chunk. The processing circuitry 112 may determine, for example, whether the top left score of the current pixel exceeds (or alternatively, is equal to or greater than) a top left score threshold, which may be set by the character count parameters 124. When the top left score of the current pixel does not exceed the top left score threshold, the processing circuitry 112 may proceed to consider a subsequent pixel of the image or interest region, if any remain (810). When the top left score of the current pixel exceeds the top left score threshold, the processing circuitry 112 may set the current pixel as the top left corner of the text chunk (820).


Continuing the boundary determination for a text chunk, the processing circuitry 112 may determine a top right pixel for the text chunk (822) through any of the methods or techniques described above. For example, the processing circuitry 112 may determine a set of top right candidate pixels from the image, and identify the top right corner of the text chunk as the pixel from among the top right candidate pixels with the highest top right score (e.g., the pixel most likely to correspond to the top right corner of a text character as specified by its top right score). The processing circuitry 112 may also determine the bottom edge of the text chunk (824) through any of the processes and techniques described above.


The processing circuitry 112 may further adjust the boundaries of a text chunk. In some variations, the processing circuitry 112 determines one or more chunk extensions (826) through which to extend the left edge or right edge (or both) of a text chunk. The processing circuitry 112 may additionally or alternatively pad the text chunk with white pixels, for example as specified by padding parameter(s) in the character count parameters 124. Using any combination of the techniques, processes, or steps described above, the processing circuitry 112 may determine a text chunk.


Upon determining a text chunk, the processing circuitry 112 may validate the text chunk (830). As an exemplary validation, the processing circuitry 112 may determine whether the height of the text chunk (e.g., pixel height) exceeds a minimum height threshold (e.g., 10 pixels). As another example, the processing circuitry may determine whether the height of the text chunk is within a maximum height threshold (e.g., 50 pixels). In some variations, the processing circuitry 112 may validate that all (or a threshold percentage) of the pixels in the text chunk are not a part of another determined text chunk. In these variations, the processing circuitry 112 may access a listing of previously determined text chunks to determine whether pixels of the text chunk belong to any of the previously determined text chunks. When the text chunk passes the validation process, the processing circuitry 112 may store the text chunk (832), e.g., by storing an indication of the text chunk in the determined text chunk listing. The indication may, for example, take the form of a database or data structure entry and may specify the boundary and/or pixels belonging to the associated text chunk. Then, the processing circuitry 112 may consider the subsequent pixel of the image or interest region, if any remain (810). When the text chunk fails the validation, the processing circuitry 112 may discard the text chunk and not store the text chunk in the determined text chunk listing. That is, the processing circuitry 112 may proceed to consider the subsequent pixel of the image or interest region (810) without storing an indication of the text chunk.
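
The validation step might look like the following sketch; the height bounds are the example values above, and the seen_pixels set (coordinates already assigned to a determined text chunk) is an assumed bookkeeping structure.

```python
def validate_chunk(top, bottom, left, right, seen_pixels,
                   min_height=10, max_height=50):
    """Check a chunk's height bounds and overlap with prior chunks."""
    height = bottom - top + 1
    if not (min_height <= height <= max_height):
        return False  # too short or too tall to be a plausible text chunk
    # Reject chunks whose pixels already belong to a determined chunk.
    for row in range(top, bottom + 1):
        for col in range(left, right + 1):
            if (row, col) in seen_pixels:
                return False
    return True
```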


Determining Character Count


After identifying text chunks in an image, e.g., a financial document image 200, the processing circuitry 112 may determine a character count for the text chunks. As described in greater detail below, the processing circuitry 112 may determine the character count for a text chunk without specifically recognizing the identity or content of any particular text characters in the text chunk. For example, the processing circuitry 112 may determine the character count for the text chunk without performing any character recognition techniques, such as Optical Character Recognition (OCR) or other similar character recognition techniques.



FIG. 9 shows an exemplary character count 900 in a text chunk 910. The processing circuitry 112 may perform the character count 900 to determine a character count for the text chunk 910 without recognizing any particular characters in the text chunk 910. That is, the text chunk 910 shown in FIG. 9 includes the text “Wilmington, Del.” and the processing circuitry 112 may obtain a character count for the text chunk 910 without recognizing the letters or the comma within the text chunk 910.


The processing circuitry 112 may process the text chunk 910 to determine a character start column and a corresponding character end column. To do so, the processing circuitry 112 may start at the leftmost pixel column of the text chunk and sequentially consider pixel columns in the text chunk 910. The processing circuitry 112 may identify a character start column when the number of black pixels in a current column exceeds a black column threshold, which may be specified in the character count parameters 124 as a number of pixels or percentage, for example. In the specific example shown in FIG. 9, the processing circuitry 112 identifies the character start column when a current pixel column has at least one black pixel or greater than 0% of black pixels. The processing circuitry 112 may identify the character start columns shown in FIG. 9 that are marked with the dotted arrows and labeled with a corresponding character start number.


Upon identifying a character start column, the processing circuitry 112 may continue to sequentially consider pixel columns to the right of the character start column to identify a corresponding character end column. The processing circuitry 112 may identify the corresponding character end column as the first pixel column to the right of the character start column with white pixels that exceed a white column threshold. The processing circuitry 112 may identify a character end column when a current pixel column has less than 2 black pixels, for example. As seen in the exemplary text chunk 910 in FIG. 9, the processing circuitry 112 identifies the character end columns marked with the non-dotted arrows and labeled with a corresponding character end number. After identifying a corresponding character end column for a particular character start column, the processing circuitry 112 may increment a character count value and continue sequentially considering pixel columns of the text chunk 910 to determine a next character start column. The processing circuitry 112 may continue the character count process until reaching the end of the text chunk 910, e.g., considering the rightmost column of the text chunk 910. In the example shown in FIG. 9, the processing circuitry 112 determines the character count of the text chunk 910 to be 13, and determines the character count without recognizing the content or identity of any of the characters in the text chunk 910.
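
The column-scan count might be sketched as follows, assuming the text chunk is a binarized 2D array (0 for white, 1 for black). The start and end criteria mirror the example thresholds above (at least one black pixel starts a character; fewer than two black pixels ends it); where a chunk ends in white columns, e.g., after right-edge padding, the final character closes before the scan ends.

```python
def count_characters(chunk, start_threshold=1, end_threshold=2):
    """Count characters in a text chunk without recognizing any character."""
    count = 0
    in_character = False
    for column in zip(*chunk):  # iterate over pixel columns, left to right
        blacks = sum(column)    # black pixels are 1s in the binarized array
        if not in_character and blacks >= start_threshold:
            in_character = True       # character start column found
        elif in_character and blacks < end_threshold:
            in_character = False      # corresponding character end column
            count += 1
    return count
```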



FIG. 10 shows an example of logic 1000 for obtaining a character count for a text chunk. The logic 1000 may be implemented in hardware, software, or both. For example, the processing circuitry 112 may implement the logic 1000 in software as the image processing instructions 122.


The processing circuitry 112 may read the character count parameters 124 (1002) and obtain a text chunk (1004). In some implementations, the processing circuitry 112 may obtain the chunk by accessing a text chunk listing or data structure, which may provide, for example, an indication of the boundaries of a particular text chunk in an image. The text chunk may be in the form of a pixel array.


In determining a character count for a text chunk, the processing circuitry 112 may process one or more pixel columns in the text chunk. The processing circuitry 112 may determine a character start column in the text chunk and then a corresponding character end column. To do so, the processing circuitry 112 may process the pixel columns in the chunk according to a pixel column processing ordering. For example, the processing circuitry 112 may process pixel columns in the text chunk in a sequential order from the left-most pixel column to the right-most pixel column. Accordingly, the processing circuitry 112 may determine whether any additional pixel columns in the text chunk remain for processing (1006). If so, the processing circuitry 112 may set the next pixel column in the pixel column processing ordering as the current column for determining a character start column (1008).


The processing circuitry 112 may identify a character start column when a current pixel column meets any number of character start column criteria. The processing circuitry 112 may identify a character start column based on a black column threshold, which may specify a percentage, proportion, or number of black pixels in a pixel column. Accordingly, the processing circuitry 112 may identify a character start column by determining whether the number or proportion of black pixels in the current pixel column exceeds a black column threshold (1010). If not, the processing circuitry 112 may consider the next pixel column in the text chunk for determining a character start column, if any remain (1006). When the number or proportion of black pixels in the current pixel column exceeds the black column threshold, the processing circuitry 112 identifies this particular pixel column as a character start column (1012).


The processing circuitry 112 may determine a corresponding character end column for the identified character start column. After identifying the character start column, the processing circuitry 112 may consider the next pixel column in the text chunk, if any remain (1014). If so, the processing circuitry 112 may set the next pixel column as the current column for determining a character end column (1016). The processing circuitry 112 may identify a character end column when a current pixel column meets any number of character end column criteria. In particular, the processing circuitry 112 may, for example, identify a character end column by determining whether the number or proportion of white pixels in the current pixel column exceeds a white column threshold (1018). If not, the processing circuitry 112 may consider the next pixel column in the text chunk for determining a character end column, if any remain (1014).


When the number or proportion of white pixels in the current pixel column exceeds the white column threshold, the processing circuitry 112 identifies this particular pixel column as a corresponding character end column to the previously determined character start column (1020). The processing circuitry 112 may increment a counter indicating the character count for the text chunk.


The processing circuitry 112 may continue processing the text chunk to determine character start columns and corresponding character end columns until no additional pixel columns remain (1006 or 1014). Then, the processing circuitry 112 may obtain the character count for the text chunk by reading the counter indicating the character count for the text chunk (1022).
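
As a concrete illustration of the column-scanning flow above, the following Python sketch counts start/end column pairs in a binarized text chunk. It is a minimal sketch, assuming a two-dimensional pixel array in which 1 denotes a black pixel and 0 a white pixel; the function name and the particular threshold values are illustrative assumptions rather than values drawn from the character count parameters 124.

```python
# Minimal sketch of the column-scanning character count of FIG. 10.
# Assumes a binarized chunk (1 = black, 0 = white); the thresholds are
# illustrative assumptions, not values taken from the description.
import numpy as np

def count_characters(chunk: np.ndarray,
                     black_column_threshold: float = 0.2,
                     white_column_threshold: float = 0.9) -> int:
    """Count characters in a text chunk without recognizing any of them."""
    height = chunk.shape[0]
    count = 0
    in_character = False  # True between a start column and its end column
    for col in chunk.T:  # process columns from leftmost to rightmost
        black_ratio = col.sum() / height
        if not in_character:
            # Character start column: black pixel proportion exceeds threshold.
            if black_ratio > black_column_threshold:
                in_character = True
        # Character end column: white pixel proportion exceeds threshold.
        elif (1.0 - black_ratio) > white_column_threshold:
            in_character = False
            count += 1  # a start/end column pair delimits one character
    return count
```

Each start/end pair delimits one character-width run of ink, so the final counter approximates the number of characters without any recognition step.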



FIG. 11 shows an example of logic 1100 that may be implemented in hardware, software, or both. For example, the processing circuitry 112 may implement the logic 1100 in software as the image processing instructions 122.


The processing circuitry 112 may read the character count parameters 124 (1102) and receive a financial document image 200 (1104). In some implementations, the character count parameters 124 may specify a character count threshold for the financial document image 200. The character count threshold may specify a minimum or maximum threshold number of characters in a financial document image 200 to meet particular quality criteria for processing the financial document image 200. Additionally, the character count parameters 124 may specify a particular character count threshold for different types of images, such as specific character count thresholds for business checks, personal checks, financial forms, remittance coupons, etc. As illustrative examples, the character count threshold may be set to 50 characters for personal checks and 100 characters for business checks. The character count parameters 124 may additionally or alternatively specify a particular character count threshold for particular regions (e.g., interest regions) of the financial document image 200 or any other image type the processing circuitry 112 may process.
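
To make the parameter structure concrete, the sketch below encodes per-type thresholds in a dictionary. Only the 50- and 100-character examples come from the description above; the dictionary layout and the helper name are assumptions for illustration.

```python
# Hypothetical layout for per-type character count thresholds; only the
# personal/business check values come from the example above.
CHARACTER_COUNT_PARAMETERS = {
    "personal_check": {"min_characters": 50},
    "business_check": {"min_characters": 100},
}

def character_count_threshold(document_type: str) -> int:
    """Look up the minimum character count threshold for a document type."""
    return CHARACTER_COUNT_PARAMETERS[document_type]["min_characters"]
```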


The processing circuitry 112 may optionally perform image pre-processing techniques on the financial document image 200 (1106), including any of the pre-processing techniques described above. The processing circuitry 112 may determine a character count for the financial document image 200, for example by identifying one or more text chunks in the financial document image 200 (1108) and determining a character count for one or more of the identified text chunks (1110). To do so, the processing circuitry 112 may utilize any combination of the methods, flows, and techniques described above. The processing circuitry 112 may determine a character count for the financial document image 200 by summing the determined character counts of the text chunks in the financial document image 200.
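
The whole-image count then reduces to summing per-chunk counts, as in the minimal sketch below, which builds on the earlier sketches; identify_text_chunks is a hypothetical stand-in for the text chunk identification flow described earlier.

```python
def image_character_count(image) -> int:
    """Sum the character counts of all text chunks identified in an image."""
    # identify_text_chunks() is a hypothetical helper standing in for the
    # text chunk identification flow described earlier.
    return sum(count_characters(chunk) for chunk in identify_text_chunks(image))
```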


The processing circuitry 112 may determine whether the character count for the financial document image 200 meets the character count criteria (1112). When the character count for the financial document image 200 fails the character count criteria, the processing circuitry 112 may instruct recapture of the financial document image (1114). For example, the processing circuitry 112 may send an image rejection message to an electronic device 102 used to capture the financial document image 200. The image rejection message may further instruct a user to recapture the image of the financial document.


When the character count for the financial document image 200 meets the character count criteria, the processing circuitry 112 may perform further image processing. For example, the processing circuitry 112 may perform character recognition (e.g., OCR) on the financial document image 200 to recognize the characters on the financial document image 200. The processing circuitry 112 may perform further processing after character recognition, such as initiating a deposit process of a check represented by the financial document image 200, processing of a medical bill or financial form, etc.
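
Combining the pieces, the pass/fail branching at (1112) and (1114) might be sketched as follows, building on the helpers above; send_rejection is a hypothetical notification helper, and the composition is an assumption for illustration.

```python
def screen_image(image, document_type: str) -> bool:
    """Gate further processing (e.g., OCR) on the character count criteria."""
    count = image_character_count(image)
    if count < character_count_threshold(document_type):
        # Fail: likely overly cropped or blurry, so instruct recapture.
        send_rejection(image, reason="character count below threshold")  # hypothetical helper
        return False
    return True  # pass: proceed to character recognition and further processing
```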


The character count criteria may serve as an initial quality screen for incoming images received by the processing circuitry 112. By determining the character count of an image prior to performing subsequent image processing, the processing circuitry 112 may verify that the image is not overly cropped; an overly cropped image would contain a character count less than a minimum threshold. Similarly, the character count criteria may be configured to prevent processing of overly blurry images, e.g., images blurred to the point that the processing circuitry 112 cannot determine enough character start and end columns, resulting in a character count less than a minimum threshold.


As discussed above, the character count parameters 124 may specify particular character count thresholds for interest regions of an image. Accordingly, the processing circuitry 112 may specifically identify text chunks and determine character counts for these interest regions instead of for the entire image. The processing circuitry 112 may determine the image passes the character count criteria when some or all of the particular character count criteria for the determined interest regions are met. As one example, the processing circuitry 112 may identify a MICR line portion of a check image as an interest region, and apply particular character count criteria for the MICR line portion, e.g., a minimum character count threshold. Additional exemplary interest regions may include high priority fields of a document, such as a social security number field, name field, address field, courtesy amount field, or any high priority region of an image received by the processing circuitry.
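
A per-region variant of the screen might look like the sketch below, with the MICR line as the example interest region from the description; extract_region and the threshold values are assumptions for illustration.

```python
# Assumed per-region minimum character counts; the values are illustrative.
REGION_THRESHOLDS = {"micr_line": 20, "courtesy_amount": 4}

def screen_interest_regions(image) -> bool:
    """Apply character count criteria per interest region instead of whole-image."""
    for region_name, min_count in REGION_THRESHOLDS.items():
        region = extract_region(image, region_name)  # hypothetical cropping helper
        if image_character_count(region) < min_count:
            return False  # this interest region fails its character count criteria
    return True
```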


By performing combinations of the methods and techniques described above, the processing circuitry 112 may identify the character count for a financial document image 200 or other image without recognizing any particular character in the financial document image 200.


Although a financial document 200 such as a check is provided by way of example above, the techniques discussed for identifying the presence, but not the specific identity or literal meaning, of characters or words may be applied to any type of document. Other documents that may be analyzed with the techniques described herein include receipts, insurance documents, coupons, and so on. Specific portions of these documents may be targeted, or only characters of a particular font size may be included, for a given type of document. An advantage of the techniques discussed above is that the processing power and time needed to recognize the presence, but not the specific identity, of characters or words may be less than that needed to actually identify each individual letter, number, or symbol. In other words, the knowledge that the captured image has chunks of text likely containing, say, four characters, used in place of identifying those four characters as “abc3”, can provide a helpful filter for a system to determine whether an expected type of document is being examined. In this way, a system may quickly, and with less processing power, filter out unacceptable (e.g., overly cropped or blurry) documents.


In some implementations, an image processing system 104 may implement the processing circuitry 112 for performing any of the methods and techniques described above, including determining a character count for a financial document image 200 without recognizing any particular character in the financial document image 200. In other implementations, an electronic device 102, such as a mobile device, may implement the processing circuitry 112. In yet other implementations, the functionality of the processing circuitry 112 may be implemented, e.g., distributed, through a combination of the image processing system 104 and electronic device 102.


The methods, devices, and logic described above may be implemented in many different ways in many different combinations of hardware, software or both hardware and software. For example, all or parts of the system may include circuitry in a controller, a microprocessor, or an application specific integrated circuit (ASIC), or may be implemented with discrete logic or components, or a combination of other types of analog or digital circuitry, combined on a single integrated circuit or distributed among multiple integrated circuits. All or part of the logic described above may be implemented as instructions for execution by a processor, controller, or other processing device and may be stored in a tangible or non-transitory machine-readable or computer-readable medium such as flash memory, random access memory (RAM) or read only memory (ROM), erasable programmable read only memory (EPROM) or other machine-readable medium such as a compact disc read only memory (CDROM), or magnetic or optical disk. Thus, a product, such as a computer program product, may include a storage medium and computer readable instructions stored on the medium, which when executed in an endpoint, computer system, or other device, cause the device to perform operations according to any of the description above.


The processing capability described above may be distributed among multiple system components, such as among multiple processors and memories, optionally including multiple distributed processing systems. Parameters, databases, and other data structures may be separately stored and managed, may be incorporated into a single memory or database, may be logically and physically organized in many different ways, and may be implemented in many ways, including data structures such as linked lists, hash tables, or implicit storage mechanisms. Programs may be parts (e.g., subroutines) of a single program, separate programs, distributed across several memories and processors, or implemented in many different ways, such as in a library, such as a shared library (e.g., a dynamic link library (DLL)). The DLL, for example, may store code that performs any of the system processing described above. While various embodiments of the systems and methods have been described, it will be apparent to those of ordinary skill in the art that many more embodiments and implementations are possible within the scope of the systems and methods. Accordingly, the systems and methods are not to be restricted except in light of the attached claims and their equivalents.

Claims
  • 1. A method comprising:
    applying, by a processor, an image pre-processing function to a digital image depicting a document;
    storing, in a memory in communication with the processor, a modified digital image based on the image pre-processing function applied to the digital image;
    converting, by the processor, an image chunk in the modified digital image into a pixel array representation comprised of first color pixels or second color pixels; and
    determining, by the processor, a character count for the image chunk based on a number of first color pixels in the image chunk and a number of second color pixels in the image chunk.
  • 2. The method of claim 1, wherein determining the character count for the image chunk comprises:
    starting at a leftmost column in the image chunk and traversing sequentially to an adjacent right-hand column in the image chunk, counting the number of first color pixels in each column;
    identifying a character start column when the number of second color pixels in a current column exceeds a second color pixel column threshold;
    identifying a character end column when the number of first color pixels in the current column exceeds a first color pixel column threshold; and
    increasing a character count value for each identified character end column having a corresponding character start column in the image chunk.
  • 3. The method of claim 1, wherein identifying the image chunk in the modified digital image comprises:
    determining a first pixel in the modified digital image as a top left pixel of the image chunk based on a top left score of the first pixel; and
    determining a second pixel of the modified digital image as a top right pixel of the image chunk based on a top right score of the second pixel; and
    determining a row of pixels in the modified digital image as a bottom edge of the image chunk.
  • 4. The method of claim 3, where determining the row of pixels in the modified digital image as the bottom edge of the image chunk comprises:
    identifying a particular row of pixels with a proportion of first color pixels exceeding a bottom edge threshold; and
    identifying the particular pixel row as the bottom edge of the image chunk.
  • 5. The method of claim 1, wherein the image pre-processing function is at least one of an image binarizing function, an image de-skewing function, an image contrast adjustment function, an image cleaning function, an image de-blurring function, or an image size adjustment function.
  • 6. The method of claim 1, further comprising:
    receiving, by the processor, a character count parameter including dimension information for the image chunk; and
    wherein identifying the image chunk in the modified digital image comprises identifying the image chunk according to the dimension information.
  • 7. The method of claim 1, further comprising: determining a quality value assigned to the modified digital image.
  • 8. The method of claim 1, further comprising: identifying an image chunk extension when at least a threshold number of pixel columns are determined to be comprised of all first color pixels adjacent to an edge of the image chunk within the modified digital image.
  • 9. A computing device system comprising:
    a memory configured to store a digital image depicting a document; and
    a processor in communication with the memory, wherein the processor is configured to:
      apply an image pre-processing function to the digital image;
      store a modified digital image based on the image pre-processing function applied to the digital image;
      convert an image chunk in the modified digital image into a pixel array representation comprised of first color pixels or second color pixels; and
      determine a character count for the image chunk based on a number of first color pixels in the image chunk and a number of second color pixels in the image chunk.
  • 10. The computing device system of claim 9, wherein the processor, to determine the character count for the image chunk, is configured to:
    start at a leftmost column in the image chunk and traverse sequentially to an adjacent right-hand column in the image chunk, count the number of first color pixels in each column;
    identify a character start column when the number of second color pixels in a current column exceeds a second color pixel column threshold;
    identify a character end column when the number of first color pixels in the current column exceeds a first color pixel column threshold; and
    increase a character count value for each identified character end column having a corresponding character start column in the image chunk.
  • 11. The computing device system of claim 9, wherein the processor, to identify the image chunk in the modified digital image, is configured to:
    determine a first pixel in the modified digital image as a top left pixel of the image chunk based on a top left score of the first pixel;
    determine a second pixel of the modified digital image as a top right pixel of the image chunk based on a top right score of the second pixel; and
    determine a row of pixels in the modified digital image as a bottom edge of the image chunk.
  • 12. The computing device system of claim 11, wherein the processor, to determine the row of pixels in the modified digital image as the bottom edge of the image chunk, is configured to:
    identify a particular row of pixels with a proportion of first color pixels exceeding a bottom edge threshold; and
    identify the particular pixel row as the bottom edge of the image chunk.
  • 13. The computing device system of claim 9, wherein the image pre-processing function is at least one of an image binarizing function, an image de-skewing function, an image contrast adjustment function, an image cleaning function, an image de-blurring function, or an image size adjustment function.
  • 14. The computing device system of claim 9, wherein the processor is further configured to:
    receive a character count parameter including dimension information for the image chunk; and
    identify the image chunk in the modified digital image comprises identifying the image chunk according to the dimension information.
  • 15. The computing device system of claim 9, wherein the processor is further configured to determine a quality value assigned to the modified digital image based on the character count value.
  • 16. The computing device system of claim 9, wherein the processor is further configured to: identify an image chunk extension when at least a threshold number of pixel columns are determined to be comprised of all first color pixels adjacent to an edge of the image chunk within the modified digital image.
  • 17. A device comprising:
    a machine-readable medium, other than a transitory signal; and
    instructions stored on the machine-readable medium, the instructions configured to, when executed by a processor, cause the processor to:
      apply an image pre-processing function to a digital image;
      store a modified digital image based on the image pre-processing function applied to the digital image;
      convert an image chunk in the modified digital image into a pixel array representation comprised of first color pixels or second color pixels; and
      determine a character count for the image chunk based on a number of first color pixels in the image chunk and a number of second color pixels in the image chunk.
  • 18. The device of claim 17, wherein the instructions, when executed by the processor, cause the processor to:
    start at a leftmost column in the image chunk and traverse sequentially to an adjacent right-hand column in the image chunk, count the number of first color pixels in each column;
    identify a character start column when the number of second color pixels in a current column exceeds a second color pixel column threshold;
    identify a character end column when the number of first color pixels in the current column exceeds a first color pixel column threshold; and
    increase a character count value for each identified character end column having a corresponding character start column in the image chunk.
  • 19. The device of claim 17, wherein the instructions, when executed by the processor, cause the processor to:
    determine a first pixel in the modified digital image as a top left pixel of the image chunk based on a top left score of the first pixel; and
    determine a second pixel of the modified digital image as a top right pixel of the image chunk based on a top right score of the second pixel; and
    determine a row of pixels in the modified digital image as a bottom edge of the image chunk.
  • 20. The device of claim 17, wherein the image pre-processing function is at least one of an image binarizing function, an image de-skewing function, an image contrast adjustment function, an image cleaning function, an image de-blurring function, or an image size adjustment function.
CROSS-REFERENCE TO RELATED APPLICATION(S)

This application is a continuation of U.S. patent application Ser. No. 16/368,202, filed Mar. 28, 2019, which is a continuation of U.S. patent application Ser. No. 15/884,990, filed Jan. 31, 2018 (now U.S. Pat. No. 10,360,448, issued Jul. 2, 2019), which is a continuation of U.S. patent application Ser. No. 15/014,918, filed Feb. 3, 2016 (now U.S. Pat. No. 9,904,848, issued Feb. 27, 2018), which is a continuation of U.S. patent application Ser. No. 14/056,565, filed Oct. 17, 2013 (now U.S. Pat. No. 9,286,514, issued Mar. 15, 2016), the entirety of each of which is incorporated by reference herein.

US Referenced Citations (1158)
Number Name Date Kind
1748489 McCarthy et al. Feb 1930 A
2292825 Dilks et al. Aug 1942 A
3005282 Christiansen Oct 1961 A
3341820 Grillmeier, Jr. et al. Sep 1967 A
3576972 Wood May 1971 A
3593913 Bremer Jul 1971 A
3620553 Donovan Nov 1971 A
3648242 Grosbard Mar 1972 A
3800124 Walsh Mar 1974 A
3816943 Henry Jun 1974 A
4002356 Weidmann Jan 1977 A
4027142 Paup et al. May 1977 A
4060711 Buros Nov 1977 A
4070649 Wright, Jr. et al. Jan 1978 A
4128202 Buros Dec 1978 A
4136471 Austin Jan 1979 A
4205780 Burns Jun 1980 A
4264808 Owens Apr 1981 A
4305216 Skelton Dec 1981 A
4321672 Braun Mar 1982 A
4346442 Musmanno Aug 1982 A
4417136 Rushby et al. Nov 1983 A
4433436 Carnes Feb 1984 A
4454610 Sziklai Jun 1984 A
RE31692 Tyburski et al. Oct 1984 E
4523330 Grosbard Jun 1985 A
4636099 Goldston Jan 1987 A
4640413 Kaplan Feb 1987 A
4644144 Chandek Feb 1987 A
4722444 Murphy et al. Feb 1988 A
4722544 Weber Feb 1988 A
4727435 Otani Feb 1988 A
4737911 Freeman Apr 1988 A
4739411 Bolton Apr 1988 A
4774574 Daly et al. Sep 1988 A
4774663 Musmanno Sep 1988 A
4790475 Griffin Dec 1988 A
4806780 Yamamoto Feb 1989 A
4837693 Schotz Jun 1989 A
4890228 Longfield Dec 1989 A
4896363 Taylor et al. Jan 1990 A
4927071 Wood May 1990 A
4934587 McNabb Jun 1990 A
4960981 Benton Oct 1990 A
4975735 Bright Dec 1990 A
5022683 Barbour Jun 1991 A
5077805 Tan Dec 1991 A
5091968 Higgins et al. Feb 1992 A
5122950 Benton et al. Jun 1992 A
5134564 Dunn et al. Jul 1992 A
5146606 Grondalski Sep 1992 A
5157620 Shaar Oct 1992 A
5159548 Caslavka Oct 1992 A
5164833 Aoki Nov 1992 A
5175682 Higashiyama et al. Dec 1992 A
5187750 Behera Feb 1993 A
5191525 LeBrun Mar 1993 A
5193121 Elischer et al. Mar 1993 A
5220501 Lawlor Jun 1993 A
5227863 Bilbrey et al. Jul 1993 A
5229589 Schneider Jul 1993 A
5233547 Kapp et al. Aug 1993 A
5237158 Kern et al. Aug 1993 A
5237159 Stephens Aug 1993 A
5237620 Deaton et al. Aug 1993 A
5257320 Etherington et al. Oct 1993 A
5265008 Benton Nov 1993 A
5268968 Yoshida Dec 1993 A
5321816 Rogan Jun 1994 A
5345090 Hludzinski Sep 1994 A
5347302 Simonoff Sep 1994 A
5350906 Brody Sep 1994 A
5373550 Campbell Dec 1994 A
5383113 Kight et al. Jan 1995 A
5419588 Wood May 1995 A
5422467 Graef Jun 1995 A
5444616 Nair et al. Aug 1995 A
5444794 Uhland, Sr. Aug 1995 A
5455875 Chevion et al. Oct 1995 A
5475403 Havlovick et al. Dec 1995 A
5504538 Tsujihara Apr 1996 A
5504677 Pollin Apr 1996 A
5528387 Kelly et al. Jun 1996 A
5530773 Thompson Jun 1996 A
5577179 Blank Nov 1996 A
5583759 Geer Dec 1996 A
5590196 Moreau Dec 1996 A
5594225 Botvin Jan 1997 A
5598969 Ong Feb 1997 A
5602936 Green Feb 1997 A
5610726 Nonoshita Mar 1997 A
5611028 Shibasaki Mar 1997 A
5630073 Nolan May 1997 A
5631984 Graf et al. May 1997 A
5668897 Stolfo Sep 1997 A
5673320 Ray et al. Sep 1997 A
5677955 Doggett Oct 1997 A
5678046 Cahill et al. Oct 1997 A
5679938 Templeton Oct 1997 A
5680611 Rail Oct 1997 A
5691524 Josephson Nov 1997 A
5699452 Vaidyanathan Dec 1997 A
5734747 Vaidyanathan Mar 1998 A
5737440 Kunkler Apr 1998 A
5748780 Stolfo May 1998 A
5751842 Riach May 1998 A
5761686 Bloomberg Jun 1998 A
5784503 Bleecker, III et al. Jul 1998 A
5830609 Warner Nov 1998 A
5832463 Funk Nov 1998 A
5838814 Moore Nov 1998 A
5848185 Koga et al. Dec 1998 A
5859935 Johnson et al. Jan 1999 A
5863075 Rich Jan 1999 A
5870456 Rogers Feb 1999 A
5870724 Lawlor Feb 1999 A
5870725 Bellinger et al. Feb 1999 A
5878337 Joao Mar 1999 A
5889884 Hashimoto et al. Mar 1999 A
5893101 Balogh et al. Apr 1999 A
5897625 Gustin Apr 1999 A
5898157 Mangili et al. Apr 1999 A
5901253 Tretter May 1999 A
5903878 Talati May 1999 A
5903881 Schrader May 1999 A
5903904 Peairs May 1999 A
5910988 Ballard Jun 1999 A
5917931 Kunkler Jun 1999 A
5924737 Schrupp Jul 1999 A
5926548 Okamoto Jul 1999 A
5930501 Neil Jul 1999 A
5930778 Geer Jul 1999 A
5937396 Konya Aug 1999 A
5940844 Cahill Aug 1999 A
5982918 Mennie Nov 1999 A
5987439 Gustin et al. Nov 1999 A
6005623 Takahashi Dec 1999 A
6012048 Gustin et al. Jan 2000 A
6014454 Kunkler Jan 2000 A
6021202 Anderson Feb 2000 A
6021397 Jones Feb 2000 A
6023705 Bellinger et al. Feb 2000 A
6029887 Furuhashi Feb 2000 A
6030000 Diamond Feb 2000 A
6032137 Ballard Feb 2000 A
6038553 Hyde Mar 2000 A
6044883 Noyes Apr 2000 A
6053405 Irwin, Jr. et al. Apr 2000 A
6059185 Funk et al. May 2000 A
6064753 Bolle et al. May 2000 A
6064762 Haenel May 2000 A
6072941 Suzuki et al. Jun 2000 A
6073119 Borenmisza-wahr Jun 2000 A
6073121 Ramzy Jun 2000 A
6085168 Mori Jul 2000 A
6086708 Colgate Jul 2000 A
6089450 Koeple Jul 2000 A
6089610 Greene Jul 2000 A
6092047 Hyman et al. Jul 2000 A
6097834 Krouse Aug 2000 A
6097845 Ng et al. Aug 2000 A
6097885 Rayner Aug 2000 A
6105865 Hardesty Aug 2000 A
6128603 Dent et al. Oct 2000 A
6141339 Kaplan et al. Oct 2000 A
6145738 Stinson et al. Nov 2000 A
6148102 Stolin Nov 2000 A
6149056 Stinson et al. Nov 2000 A
6151409 Chen et al. Nov 2000 A
6151423 Melen Nov 2000 A
6151426 Lee Nov 2000 A
6159585 Rittenhouse Dec 2000 A
6170744 Lee Jan 2001 B1
6178270 Taylor et al. Jan 2001 B1
6178409 Weber et al. Jan 2001 B1
6181837 Cahill et al. Jan 2001 B1
6188506 Kaiserman Feb 2001 B1
6189785 Lowery Feb 2001 B1
6192165 Irons Feb 2001 B1
6195452 Royer Feb 2001 B1
6195694 Chen et al. Feb 2001 B1
6199055 Kara Mar 2001 B1
6236009 Emigh et al. May 2001 B1
6243689 Norton Jun 2001 B1
6278983 Ball Aug 2001 B1
6282523 Tedesco et al. Aug 2001 B1
6282826 Richards Sep 2001 B1
6293469 Masson et al. Sep 2001 B1
6310647 Parulski et al. Oct 2001 B1
6314452 Dekel Nov 2001 B1
6315195 Ramachandran Nov 2001 B1
6317727 May Nov 2001 B1
6328207 Gregoire et al. Dec 2001 B1
6330546 Gopinathan et al. Dec 2001 B1
6339658 Moccagatta Jan 2002 B1
6339766 Gephart Jan 2002 B1
6351553 Hayosh Feb 2002 B1
6351735 Deaton et al. Feb 2002 B1
6354490 Weiss et al. Mar 2002 B1
6363162 Moed et al. Mar 2002 B1
6363164 Jones et al. Mar 2002 B1
6390362 Martin May 2002 B1
6397196 Kravetz May 2002 B1
6408084 Foley Jun 2002 B1
6411725 Rhoads Jun 2002 B1
6411737 Wesolkowski et al. Jun 2002 B2
6411938 Gates et al. Jun 2002 B1
6413305 Mehta Jul 2002 B1
6417869 Do Jul 2002 B1
6425017 Dievendorff Jul 2002 B1
6429952 Olbricht Aug 2002 B1
6439454 Masson et al. Aug 2002 B1
6449397 Che-chu Sep 2002 B1
6450403 Martens et al. Sep 2002 B1
6463220 Dance et al. Oct 2002 B1
6464134 Page Oct 2002 B1
6469745 Yamada et al. Oct 2002 B1
6470325 Leemhuis Oct 2002 B1
6473519 Pidhirny et al. Oct 2002 B1
6502747 Stoutenburg et al. Jan 2003 B1
6505178 Flenley Jan 2003 B1
6546119 Ciolli et al. Apr 2003 B2
6574377 Cahill et al. Jun 2003 B1
6574609 Downs Jun 2003 B1
6578760 Otto Jun 2003 B1
6587837 Spagna Jul 2003 B1
6606117 Windle Aug 2003 B1
6609200 Anderson Aug 2003 B2
6611598 Hayosh Aug 2003 B1
6614930 Agnihotri et al. Sep 2003 B1
6643416 Daniels Nov 2003 B1
6647136 Jones et al. Nov 2003 B2
6654487 Downs, Jr. Nov 2003 B1
6661910 Jones et al. Dec 2003 B2
6668372 Wu Dec 2003 B1
6669086 Abdi et al. Dec 2003 B2
6672452 Alves Jan 2004 B1
6682452 Quintus Jan 2004 B2
6695204 Stinson Feb 2004 B1
6697091 Rzepkowski et al. Feb 2004 B1
6711474 Treyz et al. Mar 2004 B1
6726097 Graef Apr 2004 B2
6728397 McNeal Apr 2004 B2
6738496 Van Hall May 2004 B1
6742128 Joiner May 2004 B1
6745186 Testa et al. Jun 2004 B1
6754640 Bozeman Jun 2004 B2
6755340 Voss et al. Jun 2004 B1
6760414 Schurko et al. Jul 2004 B1
6760470 Bogosian et al. Jul 2004 B1
6763226 McZeal Jul 2004 B1
6781962 Williams Aug 2004 B1
6786398 Stinson et al. Sep 2004 B1
6789054 Makhlouf Sep 2004 B1
6796489 Slater et al. Sep 2004 B2
6796491 Nakajima Sep 2004 B2
6806903 Okisu et al. Oct 2004 B1
6807294 Yamazaki Oct 2004 B2
6813733 Li Nov 2004 B1
6829704 Zhang Dec 2004 B2
6844885 Anderson Jan 2005 B2
6856965 Stinson Feb 2005 B1
6863214 Garner et al. Mar 2005 B2
6870947 Kelland Mar 2005 B2
6873728 Bernstein et al. Mar 2005 B2
6883140 Acker Apr 2005 B1
6898314 Kung et al. May 2005 B2
6902105 Koakutsu Jun 2005 B2
6910023 Schibi Jun 2005 B1
6913188 Wong Jul 2005 B2
6922487 Dance et al. Jul 2005 B2
6931255 Mekuria Aug 2005 B2
6931591 Brown Aug 2005 B1
6934719 Nally Aug 2005 B2
6947610 Sun Sep 2005 B2
6957770 Robinson Oct 2005 B1
6961689 Greenberg Nov 2005 B1
6970843 Forte Nov 2005 B1
6973589 Wright Dec 2005 B2
6983886 Natsukari et al. Jan 2006 B2
6993507 Meyer Jan 2006 B2
6996263 Jones et al. Feb 2006 B2
6999943 Johnson Feb 2006 B1
7003040 Yi Feb 2006 B2
7004382 Sandru Feb 2006 B2
7010155 Koakutsu et al. Mar 2006 B2
7010507 Anderson Mar 2006 B1
7016704 Pallakoff Mar 2006 B2
7027171 Watanabe Apr 2006 B1
7028886 Maloney Apr 2006 B1
7039048 Monta May 2006 B1
7046991 Little May 2006 B2
7051001 Slater May 2006 B1
7058036 Yu Jun 2006 B1
7062099 Li et al. Jun 2006 B2
7062456 Riehl et al. Jun 2006 B1
7062768 Kubo Jun 2006 B2
7072862 Wilson Jul 2006 B1
7076458 Lawlor et al. Jul 2006 B2
7086003 Demsky Aug 2006 B2
7092561 Downs, Jr. Aug 2006 B2
7104443 Paul et al. Sep 2006 B1
7113925 Waserstein Sep 2006 B2
7114649 Nelson Oct 2006 B2
7116446 Maurer Oct 2006 B2
7117171 Pollin Oct 2006 B1
7120461 Cho Oct 2006 B2
7131571 Swift et al. Nov 2006 B2
7139594 Nagatomo Nov 2006 B2
7140539 Crews Nov 2006 B1
7163347 Lugg Jan 2007 B2
7178721 Maloney Feb 2007 B2
7181430 Buchanan et al. Feb 2007 B1
7184980 Allen-Rouman et al. Feb 2007 B2
7185805 McShirley Mar 2007 B1
7197173 Jones et al. Mar 2007 B2
7200255 Jones Apr 2007 B2
7204412 Foss, Jr. Apr 2007 B2
7207478 Blackson et al. Apr 2007 B1
7216106 Buchanan May 2007 B1
7219082 Forte May 2007 B2
7219831 Murata May 2007 B2
7240336 Baker Jul 2007 B1
7245765 Myers et al. Jul 2007 B2
7249076 Pendleton Jul 2007 B1
7252224 Verma Aug 2007 B2
7257246 Brodie et al. Aug 2007 B1
7266230 Doran Sep 2007 B2
7277191 Metcalfe et al. Oct 2007 B2
7290034 Budd Oct 2007 B2
7299970 Ching Nov 2007 B1
7299979 Phillips Nov 2007 B2
7313543 Crane Dec 2007 B1
7314163 Crews et al. Jan 2008 B1
7321874 Dilip Jan 2008 B2
7321875 Dilip Jan 2008 B2
7325725 Foss, Jr. Feb 2008 B2
7328190 Smith et al. Feb 2008 B2
7330604 Wu et al. Feb 2008 B2
7331523 Meier et al. Feb 2008 B2
7336813 Prakash et al. Feb 2008 B2
7343320 Treyz Mar 2008 B1
7349566 Jones et al. Mar 2008 B2
7349585 Li Mar 2008 B2
7350697 Swift et al. Apr 2008 B2
7356505 March Apr 2008 B2
7369713 Suino May 2008 B2
7377425 Ma May 2008 B1
7379978 Anderson May 2008 B2
7383227 Weinflash et al. Jun 2008 B2
7385631 Maeno Jun 2008 B2
7386511 Buchanan Jun 2008 B2
7388683 Rodriguez et al. Jun 2008 B2
7389912 Starrs Jun 2008 B2
7391897 Jones et al. Jun 2008 B2
7391934 Goodall et al. Jun 2008 B2
7392935 Byrne Jul 2008 B2
7401048 Rosedale Jul 2008 B2
7403917 Larsen Jul 2008 B1
7406198 Aoki et al. Jul 2008 B2
7419093 Blackson et al. Sep 2008 B1
7421107 Lugg Sep 2008 B2
7421410 Schechtman et al. Sep 2008 B1
7427016 Chimento Sep 2008 B2
7433098 Klein et al. Oct 2008 B2
7437327 Lam Oct 2008 B2
7440924 Buchanan Oct 2008 B2
7447347 Weber Nov 2008 B2
7455220 Phillips Nov 2008 B2
7455221 Sheaffer Nov 2008 B2
7460108 Tamura Dec 2008 B2
7460700 Tsunachima et al. Dec 2008 B2
7461779 Ramachandran Dec 2008 B2
7461780 Potts Dec 2008 B2
7464859 Hawkins Dec 2008 B1
7471818 Price Dec 2008 B1
7475040 Buchanan Jan 2009 B2
7477923 Wallmark Jan 2009 B2
7480382 Dunbar Jan 2009 B2
7480422 Ackley et al. Jan 2009 B2
7489953 Griffin Feb 2009 B2
7490242 Torres Feb 2009 B2
7497429 Reynders Mar 2009 B2
7503486 Ahles Mar 2009 B2
7505759 Rahman Mar 2009 B1
7506261 Statou Mar 2009 B2
7509287 Nutahara Mar 2009 B2
7512564 Geer Mar 2009 B1
7519560 Lam Apr 2009 B2
7520420 Phillips Apr 2009 B2
7520422 Robinson et al. Apr 2009 B1
7536354 deGroeve et al. May 2009 B1
7536440 Budd May 2009 B2
7539646 Gilder May 2009 B2
7540408 Levine Jun 2009 B2
7542598 Jones Jun 2009 B2
7545529 Borrey et al. Jun 2009 B2
7548641 Gilson et al. Jun 2009 B2
7566002 Love et al. Jul 2009 B2
7571848 Cohen Aug 2009 B2
7577614 Warren et al. Aug 2009 B1
7587066 Cordery et al. Sep 2009 B2
7587363 Cataline Sep 2009 B2
7590275 Clarke et al. Sep 2009 B2
7599543 Jones Oct 2009 B2
7599888 Manfre Oct 2009 B2
7602956 Jones Oct 2009 B2
7606762 Heit Oct 2009 B1
7609873 Foth et al. Oct 2009 B2
7609889 Guo et al. Oct 2009 B2
7619721 Jones Nov 2009 B2
7620231 Jones Nov 2009 B2
7620604 Bueche, Jr. Nov 2009 B1
7630518 Frew et al. Dec 2009 B2
7644037 Ostrovsky Jan 2010 B1
7644043 Minowa Jan 2010 B2
7647275 Jones Jan 2010 B2
7668363 Price Feb 2010 B2
7672022 Fan Mar 2010 B1
7672940 Viola Mar 2010 B2
7676409 Ahmad Mar 2010 B1
7680732 Davies et al. Mar 2010 B1
7680735 Loy Mar 2010 B1
7689482 Lam Mar 2010 B2
7697776 Wu et al. Apr 2010 B2
7698222 Bueche, Jr. Apr 2010 B1
7702588 Gilder et al. Apr 2010 B2
7714778 Dupray May 2010 B2
7720735 Anderson et al. May 2010 B2
7734545 Fogliano Jun 2010 B1
7743979 Fredman Jun 2010 B2
7753268 Robinson et al. Jul 2010 B1
7761358 Craig et al. Jul 2010 B2
7766244 Field Aug 2010 B1
7769650 Bleunven Aug 2010 B2
7778457 Nepomniachtchi et al. Aug 2010 B2
7792752 Kay Sep 2010 B1
7792753 Slater et al. Sep 2010 B1
7793833 Yoon et al. Sep 2010 B2
7810714 Murata Oct 2010 B2
7812986 Graham et al. Oct 2010 B2
7818245 Prakash et al. Oct 2010 B2
7831458 Neumann Nov 2010 B2
7856402 Kay Dec 2010 B1
7865384 Anderson et al. Jan 2011 B2
7873200 Oakes, III et al. Jan 2011 B1
7876949 Oakes, III et al. Jan 2011 B1
7885451 Walls et al. Feb 2011 B1
7885880 Prasad et al. Feb 2011 B1
7894094 Nacman et al. Feb 2011 B2
7895054 Slen et al. Feb 2011 B2
7896232 Prasad et al. Mar 2011 B1
7900822 Prasad et al. Mar 2011 B1
7903863 Jones et al. Mar 2011 B2
7904386 Kalra et al. Mar 2011 B2
7912785 Kay Mar 2011 B1
7935441 Tononishi May 2011 B2
7949587 Morris et al. May 2011 B1
7950698 Popadic et al. May 2011 B2
7953441 Lors May 2011 B2
7958053 Stone Jun 2011 B2
7962411 Prasad et al. Jun 2011 B1
7970677 Oakes, III et al. Jun 2011 B1
7974899 Prasad et al. Jul 2011 B1
7978900 Nepomniachtchi et al. Jul 2011 B2
7979326 Kurushima Jul 2011 B2
7987231 Karkanias Jul 2011 B2
7996312 Beck et al. Aug 2011 B1
7996314 Smith et al. Aug 2011 B1
7996315 Smith et al. Aug 2011 B1
7996316 Smith et al. Aug 2011 B1
8000514 Nepomniachtchi et al. Aug 2011 B2
8001051 Smith et al. Aug 2011 B1
8045784 Price et al. Oct 2011 B2
8046301 Smith et al. Oct 2011 B1
8060442 Hecht et al. Nov 2011 B1
8065307 Haslam et al. Nov 2011 B2
8091778 Block et al. Jan 2012 B1
8116533 Kiplinger et al. Feb 2012 B2
8159520 Dhanoa Apr 2012 B1
8203640 Kim et al. Jun 2012 B2
8204293 Csulits et al. Jun 2012 B2
8235284 Prasad et al. Aug 2012 B1
8266076 Lopez et al. Sep 2012 B2
8271385 Emerson et al. Sep 2012 B2
8290237 Burks et al. Oct 2012 B1
8313020 Ramachandran Nov 2012 B2
8320657 Burks et al. Nov 2012 B1
8332329 Thiele Dec 2012 B1
8341077 Nichols et al. Dec 2012 B1
8351677 Oakes, III et al. Jan 2013 B1
8351678 Medina, III Jan 2013 B1
8358826 Medina et al. Jan 2013 B1
8364563 Choiniere, Sr. Jan 2013 B2
8369650 Zanfir et al. Feb 2013 B2
8374963 Billman Feb 2013 B1
8391599 Medina, III Mar 2013 B1
8392332 Oakes, III et al. Mar 2013 B1
8401962 Bent et al. Mar 2013 B1
8422758 Bueche, Jr. Apr 2013 B1
8433127 Harpel et al. Apr 2013 B1
8433647 Yarbrough Apr 2013 B1
8452689 Medina, III May 2013 B1
8464933 Prasad et al. Jun 2013 B1
8531518 Zomet Sep 2013 B1
8538124 Harpel et al. Sep 2013 B1
8542921 Medina Sep 2013 B1
8548267 Yacoub et al. Oct 2013 B1
8559766 Tilt et al. Oct 2013 B2
8582862 Nepomniachtchi et al. Nov 2013 B2
8611635 Medina, III Dec 2013 B1
8660952 Viera et al. Feb 2014 B1
8699779 Prasad et al. Apr 2014 B1
8708227 Oakes, III et al. Apr 2014 B1
8731321 Fujiwara et al. May 2014 B2
8732081 Oakes, III et al. May 2014 B1
8751345 Borzych et al. Jun 2014 B1
8751356 Garcia Jun 2014 B1
8751379 Bueche, Jr. Jun 2014 B1
8799147 Walls et al. Aug 2014 B1
8818033 Liu Aug 2014 B1
8837806 Ethington et al. Sep 2014 B1
8843405 Hartman et al. Sep 2014 B1
8929640 Mennie Jan 2015 B1
8959033 Oakes, III et al. Feb 2015 B1
8977571 Bueche, Jr. et al. Mar 2015 B1
8990862 Smith Mar 2015 B1
9009071 Watson et al. Apr 2015 B1
9036040 Danko May 2015 B1
9058512 Medina, III Jun 2015 B1
9064284 Janiszeski et al. Jun 2015 B1
9129340 Medina, III et al. Aug 2015 B1
9159101 Pollack et al. Oct 2015 B1
9177197 Prasad et al. Nov 2015 B1
9177198 Prasad et al. Nov 2015 B1
9224136 Oakes, III et al. Dec 2015 B1
9270804 Dees et al. Feb 2016 B2
9286514 Newman Mar 2016 B1
9311634 Hildebrand Apr 2016 B1
9336517 Prasad et al. May 2016 B1
9384409 Ming Jul 2016 B1
9390339 Danko Jul 2016 B1
9401011 Medina, III et al. Jul 2016 B2
9424569 Sherman et al. Aug 2016 B1
9569756 Bueche, Jr. et al. Feb 2017 B1
9613467 Roberts et al. Apr 2017 B2
9613469 Fish et al. Apr 2017 B2
9619872 Medina, III et al. Apr 2017 B1
9626183 Smith et al. Apr 2017 B1
9626662 Prasad et al. Apr 2017 B1
9779392 Prasad et al. Oct 2017 B1
9779452 Medina et al. Oct 2017 B1
9785929 Watson et al. Oct 2017 B1
9792654 Limas et al. Oct 2017 B1
9818090 Bueche, Jr. et al. Nov 2017 B1
9824453 Collins et al. Nov 2017 B1
9886642 Danko Feb 2018 B1
9892454 Pollack et al. Feb 2018 B1
9898778 Pollack et al. Feb 2018 B1
9898808 Medina, III et al. Feb 2018 B1
9904848 Newman Feb 2018 B1
9946923 Medina Apr 2018 B1
10013605 Oakes, III et al. Jul 2018 B1
10013681 Oakes, III et al. Jul 2018 B1
10157326 Long Dec 2018 B2
10181087 Danko Jan 2019 B1
10235660 Bueche, Jr. et al. Mar 2019 B1
10325420 Moon Jun 2019 B1
10354235 Medina Jul 2019 B1
10360448 Newman Jul 2019 B1
10373136 Pollack et al. Aug 2019 B1
10380559 Oakes, III et al. Aug 2019 B1
10380562 Prasad et al. Aug 2019 B1
10380565 Prasad Aug 2019 B1
10380683 Voutour et al. Aug 2019 B1
10380993 Clauer Salyers Aug 2019 B1
10402638 Oaks, III et al. Sep 2019 B1
10402790 Clark et al. Sep 2019 B1
10574879 Prasad et al. Feb 2020 B1
10621559 Oakes, III et al. Apr 2020 B1
10621660 Medina et al. Apr 2020 B1
10713629 Medina, III Jul 2020 B1
10719815 Oakes, III et al. Jul 2020 B1
10769598 Oakes, III et al. Sep 2020 B1
10818282 Clauer Salyers Oct 2020 B1
10956879 Eidson Mar 2021 B1
11030752 Backlund Jun 2021 B1
11042940 Limas Jun 2021 B1
11042941 Limas Jun 2021 B1
11062130 Medina, III Jul 2021 B1
11062131 Medina, III Jul 2021 B1
11062283 Prasad Jul 2021 B1
11064111 Prasad Jul 2021 B1
11068976 Voutour Jul 2021 B1
11070868 Mortensen Jul 2021 B1
20010004235 Maloney Jun 2001 A1
20010014881 Drummond Aug 2001 A1
20010016084 Pollard et al. Aug 2001 A1
20010018739 Anderson Aug 2001 A1
20010027994 Hayashida Oct 2001 A1
20010030695 Prabhu et al. Oct 2001 A1
20010037299 Nichols et al. Nov 2001 A1
20010042171 Vermeulen Nov 2001 A1
20010042785 Walker Nov 2001 A1
20010043748 Wesolkowski et al. Nov 2001 A1
20010047330 Gephart Nov 2001 A1
20010051965 Guillevic Dec 2001 A1
20010054020 Barth et al. Dec 2001 A1
20020001393 Jones Jan 2002 A1
20020013767 Katz Jan 2002 A1
20020016763 March Feb 2002 A1
20020016769 Barbara et al. Feb 2002 A1
20020023055 Antognini et al. Feb 2002 A1
20020025085 Gustafson et al. Feb 2002 A1
20020026418 Koppel et al. Feb 2002 A1
20020032656 Chen Mar 2002 A1
20020038289 Lawlor et al. Mar 2002 A1
20020040340 Yoshida Apr 2002 A1
20020052841 Guthrie May 2002 A1
20020052853 Munoz May 2002 A1
20020065786 Martens et al. May 2002 A1
20020072974 Pugliese Jun 2002 A1
20020075524 Blair Jun 2002 A1
20020084321 Martens Jul 2002 A1
20020087467 Mascavage, III et al. Jul 2002 A1
20020107767 McClair et al. Aug 2002 A1
20020107809 Biddle et al. Aug 2002 A1
20020116329 Serbetcioglu Aug 2002 A1
20020116335 Star Aug 2002 A1
20020118891 Rudd Aug 2002 A1
20020120562 Opiela Aug 2002 A1
20020120582 Elston et al. Aug 2002 A1
20020120846 Stewart et al. Aug 2002 A1
20020129249 Maillard et al. Sep 2002 A1
20020130868 Smith Sep 2002 A1
20020133409 Sawano et al. Sep 2002 A1
20020138445 Laage et al. Sep 2002 A1
20020138522 Muralidhar Sep 2002 A1
20020145035 Jones Oct 2002 A1
20020147798 Huang Oct 2002 A1
20020150279 Scott Oct 2002 A1
20020150311 Lynn Oct 2002 A1
20020152160 Allen-Rouman et al. Oct 2002 A1
20020152161 Aoike Oct 2002 A1
20020152164 Dutta Oct 2002 A1
20020152165 Dutta et al. Oct 2002 A1
20020152169 Dutta et al. Oct 2002 A1
20020152170 Dutta Oct 2002 A1
20020153414 Stoutenburg et al. Oct 2002 A1
20020154127 Vienneau et al. Oct 2002 A1
20020154815 Mizutani Oct 2002 A1
20020159648 Alderson et al. Oct 2002 A1
20020169715 Ruth et al. Nov 2002 A1
20020171820 Okamura Nov 2002 A1
20020172516 Aoyama Nov 2002 A1
20020178112 Goeller Nov 2002 A1
20020186881 Li Dec 2002 A1
20020188564 Star Dec 2002 A1
20020195485 Pomerleau et al. Dec 2002 A1
20030005326 Flemming Jan 2003 A1
20030009420 Jones Jan 2003 A1
20030015583 Abdi et al. Jan 2003 A1
20030018897 Bellis, Jr. et al. Jan 2003 A1
20030023557 Moore Jan 2003 A1
20030026609 Parulski Feb 2003 A1
20030038227 Sesek Feb 2003 A1
20030050889 Burke Mar 2003 A1
20030051138 Maeda et al. Mar 2003 A1
20030053692 Hong et al. Mar 2003 A1
20030055756 Allan Mar 2003 A1
20030055776 Samuelson Mar 2003 A1
20030072568 Lin et al. Apr 2003 A1
20030074315 Lam Apr 2003 A1
20030075596 Koakutsu Apr 2003 A1
20030075916 Gorski Apr 2003 A1
20030078883 Stewart et al. Apr 2003 A1
20030081824 Mennie May 2003 A1
20030086615 Dance et al. May 2003 A1
20030093367 Allen-Rouman et al. May 2003 A1
20030093369 Ijichi et al. May 2003 A1
20030097592 Adusumilli May 2003 A1
20030102714 Rhodes et al. Jun 2003 A1
20030105688 Brown et al. Jun 2003 A1
20030105714 Alarcon-Luther et al. Jun 2003 A1
20030126078 Vihinen Jul 2003 A1
20030126082 Omura et al. Jul 2003 A1
20030130940 Hansen et al. Jul 2003 A1
20030130958 Narayanan et al. Jul 2003 A1
20030132384 Sugiyama et al. Jul 2003 A1
20030133608 Bernstein et al. Jul 2003 A1
20030133610 Nagarajan et al. Jul 2003 A1
20030135457 Stewart et al. Jul 2003 A1
20030139999 Rowe Jul 2003 A1
20030159046 Choi et al. Aug 2003 A1
20030167225 Adams Sep 2003 A1
20030177448 Levine et al. Sep 2003 A1
20030187790 Swift et al. Oct 2003 A1
20030191615 Bailey Oct 2003 A1
20030191869 Williams Oct 2003 A1
20030200107 Allen et al. Oct 2003 A1
20030200174 Star Oct 2003 A1
20030202690 Jones et al. Oct 2003 A1
20030212904 Randle et al. Nov 2003 A1
20030213841 Josephson et al. Nov 2003 A1
20030217005 Drummond et al. Nov 2003 A1
20030218061 Filatov Nov 2003 A1
20030225705 Park et al. Dec 2003 A1
20030231285 Ferguson Dec 2003 A1
20030233278 Marshall Dec 2003 A1
20030233318 King et al. Dec 2003 A1
20040010466 Anderson Jan 2004 A1
20040010803 Berstis Jan 2004 A1
20040012496 De Souza Jan 2004 A1
20040013284 Yu Jan 2004 A1
20040017482 Weitman Jan 2004 A1
20040024626 Bruijning Feb 2004 A1
20040024708 Masuda Feb 2004 A1
20040029591 Chapman et al. Feb 2004 A1
20040030741 Wolton et al. Feb 2004 A1
20040044606 Buttridge et al. Mar 2004 A1
20040057697 Renzi Mar 2004 A1
20040058705 Morgan Mar 2004 A1
20040061913 Takiguchi Apr 2004 A1
20040066031 Wong Apr 2004 A1
20040066419 Pyhalammi Apr 2004 A1
20040069841 Wong Apr 2004 A1
20040071333 Douglas et al. Apr 2004 A1
20040075754 Nakajima et al. Apr 2004 A1
20040076320 Downs, Jr. Apr 2004 A1
20040078299 Down-Logan Apr 2004 A1
20040080795 Bean et al. Apr 2004 A1
20040089711 Sandru May 2004 A1
20040093303 Picciallo May 2004 A1
20040093305 Kight May 2004 A1
20040103057 Melbert et al. May 2004 A1
20040103296 Harp May 2004 A1
20040109596 Doran Jun 2004 A1
20040110975 Osinski et al. Jun 2004 A1
20040111371 Friedman Jun 2004 A1
20040117302 Weichert Jun 2004 A1
20040122754 Stevens Jun 2004 A1
20040133511 Smith et al. Jul 2004 A1
20040133516 Buchanan et al. Jul 2004 A1
20040138974 Shimamura Jul 2004 A1
20040148235 Craig et al. Jul 2004 A1
20040158549 Matena Aug 2004 A1
20040165096 Maeno Aug 2004 A1
20040170259 Park Sep 2004 A1
20040171371 Paul Sep 2004 A1
20040184766 Kim et al. Sep 2004 A1
20040201695 Inasaka Oct 2004 A1
20040201741 Ban Oct 2004 A1
20040202349 Erol et al. Oct 2004 A1
20040205459 Green Oct 2004 A1
20040210515 Hughes Oct 2004 A1
20040210523 Gains et al. Oct 2004 A1
20040217170 Takiguchi et al. Nov 2004 A1
20040225604 Foss, Jr. et al. Nov 2004 A1
20040228277 Williams Nov 2004 A1
20040236647 Acharya Nov 2004 A1
20040236688 Bozeman Nov 2004 A1
20040238619 Nagasaka et al. Dec 2004 A1
20040240722 Tsuji et al. Dec 2004 A1
20040245324 Chen Dec 2004 A1
20040247199 Murai et al. Dec 2004 A1
20040248600 Kim Dec 2004 A1
20040252679 Williams Dec 2004 A1
20040260636 Marceau Dec 2004 A1
20040267665 Nam et al. Dec 2004 A1
20040267666 Minami Dec 2004 A1
20050001421 Luth et al. Jan 2005 A1
20050010108 Rahn et al. Jan 2005 A1
20050015332 Chen Jan 2005 A1
20050015341 Jackson Jan 2005 A1
20050015342 Murata et al. Jan 2005 A1
20050021466 Buchanan et al. Jan 2005 A1
20050030388 Stavely et al. Feb 2005 A1
20050033645 Duphily Feb 2005 A1
20050033685 Reyes Feb 2005 A1
20050033690 Antognini et al. Feb 2005 A1
20050033695 Minowa Feb 2005 A1
20050034046 Berkmann Feb 2005 A1
20050035193 Gustin et al. Feb 2005 A1
20050038746 Latimer et al. Feb 2005 A1
20050038754 Geist Feb 2005 A1
20050044042 Mendiola Feb 2005 A1
20050044577 Jerding Feb 2005 A1
20050049950 Johnson Mar 2005 A1
20050071283 Randle et al. Mar 2005 A1
20050075969 Nielson et al. Apr 2005 A1
20050075974 Turgeon Apr 2005 A1
20050077351 De Jong Apr 2005 A1
20050078336 Ferlitsch Apr 2005 A1
20050080725 Pick Apr 2005 A1
20050082364 Alvarez et al. Apr 2005 A1
20050086140 Ireland Apr 2005 A1
20050086168 Alvarez Apr 2005 A1
20050089209 Stefanuk Apr 2005 A1
20050091161 Gustin Apr 2005 A1
20050096992 Geisel May 2005 A1
20050097019 Jacobs May 2005 A1
20050097046 Singfield May 2005 A1
20050097050 Orcutt May 2005 A1
20050100216 Myers et al. May 2005 A1
20050108164 Salafia May 2005 A1
20050108168 Halpin May 2005 A1
20050115110 Dinkins Jun 2005 A1
20050125338 Tidwell et al. Jun 2005 A1
20050125360 Tidwell et al. Jun 2005 A1
20050127160 Fujikawa Jun 2005 A1
20050131820 Rodriguez Jun 2005 A1
20050143136 Lev et al. Jun 2005 A1
20050144131 Aziz Jun 2005 A1
20050149436 Elterich Jul 2005 A1
20050157174 Kitamura et al. Jul 2005 A1
20050017751 Hilt et al. Aug 2005 A1
20050168566 Tada Aug 2005 A1
20050171899 Dunn Aug 2005 A1
20050171907 Lewis Aug 2005 A1
20050177494 Kelly et al. Aug 2005 A1
20050177499 Thomas Aug 2005 A1
20050177518 Brown Aug 2005 A1
20050182710 Anderson Aug 2005 A1
20050188306 Mackenzie Aug 2005 A1
20050198364 del Val et al. Sep 2005 A1
20050203430 Williams et al. Sep 2005 A1
20050205660 Munte Sep 2005 A1
20050205661 Taylor Sep 2005 A1
20050209961 Michelsen Sep 2005 A1
20050213805 Blake et al. Sep 2005 A1
20050216409 McMonagle et al. Sep 2005 A1
20050216410 Davis et al. Sep 2005 A1
20050218209 Heilper et al. Oct 2005 A1
20050220324 Klein et al. Oct 2005 A1
20050228733 Bent et al. Oct 2005 A1
20050238257 Kaneda et al. Oct 2005 A1
20050244035 Klein et al. Nov 2005 A1
20050252955 Sugai Nov 2005 A1
20050267843 Acharya et al. Dec 2005 A1
20050268107 Harris et al. Dec 2005 A1
20050269412 Chiu Dec 2005 A1
20050273368 Hutten et al. Dec 2005 A1
20050278250 Zair Dec 2005 A1
20050281448 Lugg Dec 2005 A1
20050281450 Richardson Dec 2005 A1
20050281471 LeConte Dec 2005 A1
20050281474 Huang Dec 2005 A1
20050289030 Smith Dec 2005 A1
20050289059 Brewington et al. Dec 2005 A1
20050289182 Pandian et al. Dec 2005 A1
20060002426 Madour Jan 2006 A1
20060004660 Pranger Jan 2006 A1
20060015450 Guck et al. Jan 2006 A1
20060015733 O'Malley et al. Jan 2006 A1
20060017752 Kurzweil et al. Jan 2006 A1
20060025697 Kurzweil Feb 2006 A1
20060039628 Li et al. Feb 2006 A1
20060039629 Li et al. Feb 2006 A1
20060041506 Mason et al. Feb 2006 A1
20060045374 Kim et al. Mar 2006 A1
20060045379 Heaney, Jr. et al. Mar 2006 A1
20060047593 Naratil Mar 2006 A1
20060049242 Mejias et al. Mar 2006 A1
20060053056 Alspach-Goss Mar 2006 A1
20060059085 Tucker Mar 2006 A1
20060064368 Forte Mar 2006 A1
20060071950 Kurzweil et al. Apr 2006 A1
20060077941 Alagappan et al. Apr 2006 A1
20060080245 Bahl Apr 2006 A1
20060085357 Pizarro Apr 2006 A1
20060085516 Farr et al. Apr 2006 A1
20060102704 Reynders May 2006 A1
20060103893 Azimi et al. May 2006 A1
20060106691 Sheaffer May 2006 A1
20060106717 Randle May 2006 A1
20060108168 Fischer et al. May 2006 A1
20060110063 Weiss May 2006 A1
20060112013 Maloney May 2006 A1
20060115110 Rodriguez Jun 2006 A1
20060115141 Koakutsu et al. Jun 2006 A1
20060118613 McMann Jun 2006 A1
20060124730 Maloney Jun 2006 A1
20060144924 Stover Jul 2006 A1
20060144937 Heilper et al. Jul 2006 A1
20060144950 Johnson Jul 2006 A1
20060159367 Zeineh et al. Jul 2006 A1
20060161499 Rich et al. Jul 2006 A1
20060161501 Waserstein Jul 2006 A1
20060164682 Lev Jul 2006 A1
20060166178 Driedijk Jul 2006 A1
20060167818 Wentker et al. Jul 2006 A1
20060181614 Yen et al. Aug 2006 A1
20060182331 Gilson et al. Aug 2006 A1
20060182332 Weber Aug 2006 A1
20060186194 Richardson Aug 2006 A1
20060202014 VanKirk et al. Sep 2006 A1
20060206506 Fitzpatrick Sep 2006 A1
20060208059 Cable et al. Sep 2006 A1
20060210138 Hilton et al. Sep 2006 A1
20060212391 Norman et al. Sep 2006 A1
20060212393 Brown Sep 2006 A1
20060214940 Kinoshita Sep 2006 A1
20060215204 Miyamoto et al. Sep 2006 A1
20060215230 Borrey et al. Sep 2006 A1
20060221198 Fry et al. Oct 2006 A1
20060222260 Sambongi et al. Oct 2006 A1
20060229976 Jung Oct 2006 A1
20060229986 Corder Oct 2006 A1
20060229987 Leekley Oct 2006 A1
20060238503 Smith Oct 2006 A1
20060242062 Peterson Oct 2006 A1
20060242063 Peterson Oct 2006 A1
20060248009 Hicks et al. Nov 2006 A1
20060249567 Byrne Nov 2006 A1
20060255124 Hoch Nov 2006 A1
20060273165 Swift et al. Dec 2006 A1
20060274164 Kimura et al. Dec 2006 A1
20060279628 Fleming Dec 2006 A1
20060282383 Doran Dec 2006 A1
20060289630 Updike et al. Dec 2006 A1
20060291744 Ikeda et al. Dec 2006 A1
20070002157 Shintani et al. Jan 2007 A1
20070005467 Haigh et al. Jan 2007 A1
20070013721 Vau et al. Jan 2007 A1
20070016796 Singhal Jan 2007 A1
20070019243 Sato Jan 2007 A1
20070027802 VanDeburg et al. Feb 2007 A1
20070030357 Levien et al. Feb 2007 A1
20070030363 Cheatle et al. Feb 2007 A1
20070031022 Frew Feb 2007 A1
20070038561 Vancini et al. Feb 2007 A1
20070041629 Prakash et al. Feb 2007 A1
20070050292 Yarbrough Mar 2007 A1
20070053574 Verma et al. Mar 2007 A1
20070058851 Quine Mar 2007 A1
20070063016 Myatt Mar 2007 A1
20070064991 Douglas et al. Mar 2007 A1
20070065143 Didow et al. Mar 2007 A1
20070075772 Kokubo Apr 2007 A1
20070076940 Goodall et al. Apr 2007 A1
20070076941 Carreon et al. Apr 2007 A1
20070077921 Hayashi Apr 2007 A1
20070080207 Williams Apr 2007 A1
20070082700 Landschaft Apr 2007 A1
20070084911 Crowell Apr 2007 A1
20070086642 Foth Apr 2007 A1
20070086643 Spier Apr 2007 A1
20070094088 Mastie Apr 2007 A1
20070094140 Riney et al. Apr 2007 A1
20070100748 Dheer May 2007 A1
20070110277 Hayduchok et al. May 2007 A1
20070116364 Kleihorst et al. May 2007 A1
20070118472 Allen-Rouman et al. May 2007 A1
20070118747 Pintsov et al. May 2007 A1
20070122024 Haas et al. May 2007 A1
20070124241 Newton May 2007 A1
20070127805 Foth et al. Jun 2007 A1
20070129955 Dalmia Jun 2007 A1
20070130063 Jindia Jun 2007 A1
20070131758 Mejias et al. Jun 2007 A1
20070136198 Foth et al. Jun 2007 A1
20070138255 Carreon et al. Jun 2007 A1
20070140545 Rossignoli Jun 2007 A1
20070140594 Franklin Jun 2007 A1
20070143208 Varga Jun 2007 A1
20070150337 Hawkins et al. Jun 2007 A1
20070154098 Geva et al. Jul 2007 A1
20070156438 Popadic et al. Jul 2007 A1
20070168265 Rosenberger Jul 2007 A1
20070168283 Alvarez et al. Jul 2007 A1
20070171288 Inoue Jul 2007 A1
20070172107 Jones et al. Jul 2007 A1
20070172148 Hawley Jul 2007 A1
20070175977 Bauer et al. Aug 2007 A1
20070179883 Questembert Aug 2007 A1
20070183000 Eisen et al. Aug 2007 A1
20070183652 Backstrom et al. Aug 2007 A1
20070183741 Lerman et al. Aug 2007 A1
20070194102 Cohen Aug 2007 A1
20070198432 Pitroda et al. Aug 2007 A1
20070203708 Polycn et al. Aug 2007 A1
20070206877 Wu et al. Sep 2007 A1
20070208816 Baldwin et al. Sep 2007 A1
20070214086 Homoki Sep 2007 A1
20070217669 Swift et al. Sep 2007 A1
20070233525 Boyle Oct 2007 A1
20070233585 Ben Simon et al. Oct 2007 A1
20070235518 Mueller et al. Oct 2007 A1
20070235520 Smith et al. Oct 2007 A1
20070241179 Davis Oct 2007 A1
20070244782 Chimento Oct 2007 A1
20070246525 Smith et al. Oct 2007 A1
20070251992 Sharma et al. Nov 2007 A1
20070255652 Tumminaro Nov 2007 A1
20070255653 Tumminaro Nov 2007 A1
20070255662 Tumminaro Nov 2007 A1
20070258634 Simonoff Nov 2007 A1
20070262137 Brown Nov 2007 A1
20070262148 Yoon Nov 2007 A1
20070268540 Gaspardo et al. Nov 2007 A1
20070271182 Prakash et al. Nov 2007 A1
20070278286 Crowell et al. Dec 2007 A1
20070288380 Starrs Dec 2007 A1
20070288382 Narayanan et al. Dec 2007 A1
20070295803 Levine et al. Dec 2007 A1
20070299928 Kohli et al. Dec 2007 A1
20080002911 Eisen Jan 2008 A1
20080010204 Rackley, III et al. Jan 2008 A1
20080021802 Pendelton Jan 2008 A1
20080040280 Davis et al. Feb 2008 A1
20080046362 Easterly Feb 2008 A1
20080052182 Marshall Feb 2008 A1
20080059376 Davis Mar 2008 A1
20080063253 Wood Mar 2008 A1
20080065524 Matthews et al. Mar 2008 A1
20080068674 McIntyre Mar 2008 A1
20080069427 Liu Mar 2008 A1
20080071679 Foley Mar 2008 A1
20080071721 Wang Mar 2008 A1
20080073423 Heit et al. Mar 2008 A1
20080080760 Ronca Apr 2008 A1
20080086421 Gilder Apr 2008 A1
20080086770 Kulkarni et al. Apr 2008 A1
20080091599 Foss, Jr. Apr 2008 A1
20080097899 Jackson et al. Apr 2008 A1
20080097907 Till et al. Apr 2008 A1
20080103790 Abernethy May 2008 A1
20080103967 Ackert et al. May 2008 A1
20080113674 Baig May 2008 A1
20080114739 Hayes May 2008 A1
20080115066 Pavley et al. May 2008 A1
20080116257 Fickling May 2008 A1
20080117991 Peddireddy May 2008 A1
20080119178 Peddireddy May 2008 A1
20080133411 Jones et al. Jun 2008 A1
20080140552 Blaikie Jun 2008 A1
20080147549 Ruthbun Jun 2008 A1
20080155672 Sharma Jun 2008 A1
20080156438 Stumphauzer et al. Jul 2008 A1
20080162319 Breeden et al. Jul 2008 A1
20080162320 Mueller et al. Jul 2008 A1
20080162350 Allen-Rouman et al. Jul 2008 A1
20080162371 Rampell et al. Jul 2008 A1
20080177659 Lacey et al. Jul 2008 A1
20080180750 Feldman Jul 2008 A1
20080205751 Mischler Aug 2008 A1
20080208727 McLauqhlin et al. Aug 2008 A1
20080214180 Cunningham et al. Sep 2008 A1
20080219543 Csulits Sep 2008 A1
20080245869 Berkun et al. Oct 2008 A1
20080247629 Gilder Oct 2008 A1
20080247655 Yano Oct 2008 A1
20080249931 Gilder et al. Oct 2008 A1
20080249951 Gilder et al. Oct 2008 A1
20080262950 Christensen et al. Oct 2008 A1
20080262953 Anderson Oct 2008 A1
20080275821 Bishop et al. Nov 2008 A1
20080301441 Calman et al. Dec 2008 A1
20080304769 Hollander et al. Dec 2008 A1
20080316542 Mindrum et al. Dec 2008 A1
20090024520 Drory et al. Jan 2009 A1
20090046938 Yoder Feb 2009 A1
20090060396 Blessan et al. Mar 2009 A1
20090066987 Inokuchi Mar 2009 A1
20090076921 Nelson et al. Mar 2009 A1
20090092309 Calman et al. Apr 2009 A1
20090094148 Gilder et al. Apr 2009 A1
20090108080 Meyer Apr 2009 A1
20090110281 Hirabayashi Apr 2009 A1
20090114716 Ramachandran May 2009 A1
20090132813 Schibuk May 2009 A1
20090141962 Borgia et al. Jun 2009 A1
20090164350 Sorbe et al. Jun 2009 A1
20090164370 Sorbe et al. Jun 2009 A1
20090166406 Pigg et al. Jul 2009 A1
20090167870 Caleca et al. Jul 2009 A1
20090171795 Clouthier et al. Jul 2009 A1
20090171819 Emde et al. Jul 2009 A1
20090171825 Roman Jul 2009 A1
20090173781 Ramachadran Jul 2009 A1
20090185241 Nepomniachtchi Jul 2009 A1
20090185737 Nepomniachtchi Jul 2009 A1
20090185738 Nepomniachtchi Jul 2009 A1
20090190823 Walters Jul 2009 A1
20090192938 Amos Jul 2009 A1
20090212929 Drory et al. Aug 2009 A1
20090236413 Mueller et al. Sep 2009 A1
20090240620 Kendrick et al. Sep 2009 A1
20090252437 Li Oct 2009 A1
20090254447 Blades Oct 2009 A1
20090257641 Liu et al. Oct 2009 A1
20090263019 Tzadok et al. Oct 2009 A1
20090271287 Halpern Oct 2009 A1
20090281904 Pharris Nov 2009 A1
20090284637 Parulski et al. Nov 2009 A1
20090290751 Ferman et al. Nov 2009 A1
20090292628 Dryer et al. Nov 2009 A1
20090313167 Dujari et al. Dec 2009 A1
20090319425 Tumminaro et al. Dec 2009 A1
20090327129 Collas et al. Dec 2009 A1
20100007899 Lay Jan 2010 A1
20100008579 Smirnov Jan 2010 A1
20100016016 Brundage et al. Jan 2010 A1
20100027679 Sunahara et al. Feb 2010 A1
20100030687 Panthaki et al. Feb 2010 A1
20100047000 Park et al. Feb 2010 A1
20100057578 Blair et al. Mar 2010 A1
20100061446 Hands et al. Mar 2010 A1
20100078471 Lin et al. Apr 2010 A1
20100082468 Low et al. Apr 2010 A1
20100082470 Walach Apr 2010 A1
20100165015 Barkley et al. Jul 2010 A1
20100198733 Gantman et al. Aug 2010 A1
20100225773 Lee Sep 2010 A1
20100226559 Najari et al. Sep 2010 A1
20100260408 Prakash et al. Oct 2010 A1
20100262522 Anderson et al. Oct 2010 A1
20100274693 Bause et al. Oct 2010 A1
20100312705 Caruso et al. Dec 2010 A1
20110016084 Mundy et al. Jan 2011 A1
20110069180 Nijemcevic et al. Mar 2011 A1
20110106675 Perlman May 2011 A1
20110112967 Anderson et al. May 2011 A1
20110170740 Coleman Jul 2011 A1
20110191161 Dai Aug 2011 A1
20110251956 Cantley et al. Oct 2011 A1
20110276483 Saegert et al. Nov 2011 A1
20110280450 Nepomniachtchi et al. Nov 2011 A1
20110285874 Showering et al. Nov 2011 A1
20110310442 Popadic et al. Dec 2011 A1
20120045112 Lundblad et al. Feb 2012 A1
20120047070 Pharris Feb 2012 A1
20120062732 Marman et al. Mar 2012 A1
20120089514 Kraemling et al. Apr 2012 A1
20120099792 Chevion et al. Apr 2012 A1
20120185383 Atsmon Jul 2012 A1
20120185388 Pranger Jul 2012 A1
20120229872 Dolev Sep 2012 A1
20130021651 Popadic et al. Jan 2013 A9
20130120595 Roach et al. May 2013 A1
20130155474 Roach et al. Jun 2013 A1
20130198071 Jurss Aug 2013 A1
20130201534 Carlen Aug 2013 A1
20130223721 Nepomniachtchi et al. Aug 2013 A1
20130297353 Strange Nov 2013 A1
20140032406 Roach et al. Jan 2014 A1
20140067661 Elischer Mar 2014 A1
20140197922 Stanwood et al. Jul 2014 A1
20140236820 Carlton et al. Aug 2014 A1
20140258169 Wong et al. Sep 2014 A1
20140279453 Belchee et al. Sep 2014 A1
20150039528 Minogue et al. Feb 2015 A1
20150090782 Dent Apr 2015 A1
20160034590 Endras et al. Feb 2016 A1
20160142625 Weksler et al. May 2016 A1
20160335816 Thoppae et al. Nov 2016 A1
20170146602 Samp et al. May 2017 A1
20170033761 Beguesse Nov 2017 A1
Foreign Referenced Citations (17)
Number Date Country
2619884 Mar 2007 CA
1897644 Jan 2007 CN
0984410 Mar 2000 EP
1 855 459 May 2007 EP
2004-23158 Jan 2004 JP
2006-174105 Jun 2006 JP
20040076131 Aug 2004 KR
WO 9614707 May 1996 WO
WO 9837655 Aug 1998 WO
WO 0161436 Aug 2001 WO
WO 2004008350 Jan 2004 WO
WO 2005043857 May 2005 WO
WO 2005124657 Dec 2005 WO
WO 2006075967 Jul 2006 WO
WO 2006086768 Aug 2006 WO
WO 2006136958 Dec 2006 WO
WO 2007024889 Mar 2007 WO
Non-Patent Literature Citations (442)
Entry
Fletcher, Lloyd A., and Rangachar Kasturi. “A robust algorithm for text string separation from mixed text/graphics images.” IEEE transactions on pattern analysis and machine intelligence 10.6 (1988): 910-918. (Year: 1988).
Defendant Wells Fargo Bank, N.A.'s Second Amended Answer, Affirmative Defenses, and Counterclaims To Plaintiff's Amended Complaint, United Services Automobile Association v. Wells Fargo Bank, N.A., Civil Action No. 2:18-cv-245, dated Aug. 1, 2019, 72 pgs.
Claim Construction Memorandum Opinion and Order, United Services Automobile Association v. Wells Fargo Bank, N.A., Civil Action No. 2:18-cv-366, dated Jul. 29, 2019, 36 pgs.
Wells Fargo's Objections To Magistrate Judge Payne's Claim Construction Memorandum Opinion and Order, United Services Automobile Association v. Wells Fargo Bank, N.A., Civil Action No. 2:18-cv-366, dated Aug. 12, 2019, 7 pgs.
USAA's Objections To Magistrate Judge Payne's Claim Construction Memorandum Opinion and Order, United Services Automobile Association v. Wells Fargo Bank, N.A., Civil Action No. 2:18-cv-366, dated Aug. 12, 2019, 10 pgs.
IPR2019-00815 U.S. Pat. No. 9,818,090, Petitioner's Reply Brief to Patent Owner Preliminary Response Pursuant To Authorization Provided In Paper No. 13, dated Aug. 1, 2019, 9 pgs.
IPR2019-00815 U.S. Pat. No. 9,818,090, Petitioner's Supplemental Exhibit List, dated Aug. 1, 2019, 5 pgs.
IPR2019-00815 U.S. Pat. No. 9,818,090, United Services Automobile Association (USAA)'s Sur-Reply In Support of Patent Owner Preliminary Response, dated Aug. 8, 2019, 8 pgs.
IPR2019-00815 U.S. Pat. No. 9,818,090, Decision Denying Institution of Inter Partes Review, dated Aug. 26, 2019, 28 pgs.
Mitek Video titled “Mobile Deposit Tour”, Published on Jul. 2, 2009 by Mitek Systems, duration 2 minutes and 13 seconds, located on the Internet at: https://www.youtube.com/watch?v=sGD49ybxS2Q, 25 pgs.
Provisional patent application filed by Wells Fargo Bank, dated Jan. 29, 2008, 134 pgs.
SCH-i910 Portable Dualmode Smartphone User Guide by Samsung, Copyright 2009 Samsung Electronics Canada, downloadable from www.manualslib.com, 168 pgs.
U.S. Appl. No. 61/022,279, filed Jan. 18, 2008, (cited in IPR2020-00090, U.S. Pat. No. 9,177,197), 35 pgs.
Panini My Vision X Operator Manual, Panini, 2004, (cited in IPR2020-00093, U.S. Pat. No. 9,892,454), 51 pgs.
Yeo, L.H. et al., “Submission of transaction from mobile workstations in a cooperative multidatabase environment”, IEEE, 1994, (cited in IPR2020-00097, U.S. Pat. No. 7,885,880), 10 pgs.
Cormac Herley, “Recursive Method to Extract Rectangular Objects From Scans”, 4 pages, Oct. 2003.
E. Tochip et al., “Camera Phone Color Appearance Utility—Finding a Way to Identify Camera Phone Picture Color”, 25 pages, 2007.
Patent Disclaimer for U.S. Pat. No. 8,699,779, filed on Mar. 4, 2019, 2 pgs.
Patent Disclaimer for U.S. Pat. No. 8,977,571, filed on Feb. 20, 2019, 2 pgs.
Patent Disclaimer for U.S. Pat. No. 9,336,517, filed on Mar. 4, 2019, 2 pgs.
Patent Disclaimer for U.S. Pat. No. 9,818,090, filed on Feb. 20, 2019, 2 pgs.
CBM2019-00002 U.S. Pat. No. 9,818,090, United Services Automobile Association (USAA)'s Patent Owner Preliminary Response, dated Feb. 20, 2019, 75 pgs.
CBM2019-00002 U.S. Pat. No. 9,818,090, Declaration of Tim Crews in Support of Patent Owner Preliminary Response, dated Feb. 20, 2019, 8 pgs.
CBM2019-00002 U.S. Pat. No. 9,818,090, Declaration of Matthew Calman in Support of Patent Owner Preliminary Response, dated Feb. 20, 2019, 14 pgs.
CBM2019-00002 U.S. Pat. No. 9,818,090, Katie Knight Videotape Deposition Transcript, dated Feb. 8, 2019, 27 pgs.
CBM2019-00002 U.S. Pat. No. 9,818,090, Peter Alexander, Ph.D., Oral and Videotaped Deposition, dated Jan. 23, 2019, 27 pgs.
CBM2019-00002 U.S. Pat. No. 9,818,090, United Services Automobile Association (USAA)'s Updated Exhibit List, dated Mar. 19, 2019, 8 pgs.
CBM2019-00003 U.S. Pat. No. 9,336,517, United Services Automobile Association (USAA)'s Patent Owner Preliminary Response, dated Mar. 4, 2019, 91 pgs.
CBM2019-00003 U.S. Pat. No. 9,336,517, Declaration of Matthew Calman in Support of Patent Owner Preliminary Response, dated Mar. 4, 2019, 15 pgs.
CBM2019-00003 U.S. Pat. No. 9,336,517, Katie Knight Videotape Deposition Transcript, dated Feb. 8, 2019, 27 pgs.
CBM2019-00003 U.S. Pat. No. 9,336,517, Peter Alexander, Ph.D., Oral and Videotaped Deposition, dated Jan. 23, 2019, 27 pgs.
CBM2019-00003 U.S. Pat. No. 9,336,517, United Services Automobile Association (USAA)'s Updated Exhibit List Pursuant to 37 CFR 42.63(e), dated Mar. 19, 2019, 8 pgs.
CBM2019-00003 U.S. Pat. No. 9,336,517, Petitioner's Reply Brief to Patent Owner Preliminary Response Pursuant to Authorization Provided in Paper No. 14, dated Apr. 10, 2019, 10 pgs.
CBM2019-00004 U.S. Pat. No. 8,977,571, Declaration of Tim Crews in Support of Patent Owner Preliminary Response, dated Feb. 20, 2019, 8 pgs.
CBM2019-00004 U.S. Pat. No. 8,977,571, United Services Automobile Association (USAA)'s Patent Owner Preliminary Response, dated Feb. 20, 2019, 99 pgs.
CBM2019-00004 U.S. Pat. No. 8,977,571, Declaration of Matthew Calman in Support of Patent Owner Preliminary Response, dated Feb. 20, 2019, 14 pgs.
CBM2019-00004 U.S. Pat. No. 8,977,571, United Services Automobile Association (USAA)'s Updated Exhibit List Pursuant to 37 CFR 42.63(e), dated Mar. 19, 2019, 8 pgs.
CBM2019-00005 U.S. Pat. No. 8,699,779, United Services Automobile Association's (USAA)'s Patent Owner Preliminary Response, dated Mar. 4, 2019, 103 pgs.
CBM2019-00005 U.S. Pat. No. 8,699,779, Katie Knight Videotape Deposition Transcript, dated Feb. 8, 2019, 27 pgs.
CBM2019-00005 U.S. Pat. No. 8,699,779 Matthew A. Calman Declaration, dated Mar. 4, 2019, 15 pgs.
CBM2019-00005 U.S. Pat. No. 8,699,779 Peter Alexander, Ph.D., Oral and Videotaped Deposition, dated Jan. 23, 2019, 27 pgs.
CBM2019-00027 U.S. Pat. No. 9,224,136 Declaration of Peter Alexander, Ph.D., dated Mar. 28, 2019, 147 pgs.
CBM2019-00027 U.S. Pat. No. 9,224,136 Petition for Covered Business Method Review of Claims 1-3, 5-9, 11-16 and 18 of U.S. Pat. No. 9,224,136, dated Mar. 28, 2019, 93 pgs.
CBM2019-00027 U.S. Pat. No. 9,224,136 Notice of Filing Date Accorded to Petition and Time for Filing Patent Owner Preliminary Response, dated Apr. 8, 2019, 3 pgs.
CBM2019-00028 U.S. Pat. No. 10,013,681, Plaintiff United Services Automobile Association (USAA) Preliminary Claim Constructions and Extrinsic Evidence, dated Mar. 15, 2019, 74 pgs.
CBM2019-00028 U.S. Pat. No. 10,013,681, Declaration of Peter Alexander, Ph.D., dated Mar. 28, 2019, 94 pgs.
CBM2019-00028 U.S. Pat. No. 10,013,681, Petition for Covered Business Method Review of Claims 1-30 of U.S. Pat. No. 10,013,681, dated Mar. 28, 2019, 99 pgs.
CBM2019-00028 U.S. Pat. No. 10,013,681, Petitioner's Updated Exhibit List (as of Apr. 1, 2019) for U.S. Pat. No. 10,013,681, dated Apr. 1, 2019, 5 pgs.
CBM2019-00028 U.S. Pat. No. 10,013,681, Notice of Filing Date Accorded to Petition and Time for Filing Patent owner Preliminary Response for U.S. Pat. No. 10,013,681, dated Apr. 8, 2019, 3 pgs.
CBM2019-00029 U.S. Pat. No. 10,013,605, Declaration of Peter Alexander, Ph.D., dated Mar. 28, 2019, 76 pgs.
CBM2019-00029 U.S. Pat. No. 10,013,605, Petition for Covered Business Method Review of Claims 1-3, 5-14, 16-29 of U.S. Pat. No. 10,013,605, dated Mar. 28, 2019, 88 pgs.
CBM2019-00029 U.S. Pat. No. 10,013,605, Plaintiff United Services Automobile Association (USAA) Preliminary Claim Constructions and Extrinsic Evidence, dated Mar. 15, 2019, 74 pgs.
IPR2019-00815 U.S. Pat. No. 9,818,090, Petition for Inter Partes Review of Claims 1-9 of U.S. Pat. No. 9,818,090, dated Mar. 20, 2019, 56 pgs.
IPR2019-00815 U.S. Pat. No. 9,818,090, Declaration of Peter Alexander, Ph.D. as filed in the IPR on Mar. 20, 2019, 99 pgs.
IPR2019-00815 U.S. Pat. No. 9,818,090, Notice of Filing Date Accorded to Petition and Time for Filing Patent Owner Preliminary Response, dated Mar. 27, 2019, 5 pgs.
IPR2019-00815 U.S. Pat. No. 9,818,090, Exhibit B Proposed Claim Constructions for the '571, '090, '779 and '517 Patents, filed on Feb. 28, 2019, 10 pgs.
ABA Routing Transit Number, Wikipedia, dated Sep. 27, 2006, 3 pgs.
Accredited Standards Committee Technical Report TR 33-2006, dated Aug. 28, 2006, 75 pgs.
ANS X9.100-140-2004, “Specification for an Image Replacement document—IRD”, American Standard for Financial Services, Oct. 1, 2004, 15 pgs.
ANSI News, Check 21 Goes Into Effect Oct. 28, 2004, dated Oct. 25, 2004, 1 pg.
ANSI, “Return Reasons for Check Image Exchange of IRDS”, dated May 6, 2016, 23 pgs.
ANSI, “Specifications For Electronic Exchange of Check and Image Data”, dated Jul. 11, 2006, 230 pgs.
ANSI X9.7-1999(R2007), Bank Check Background and Convenience Amount Field Specification, dated Jul. 11, 2007, 86 pgs.
ASCX9, "Specification for Electronic Exchange of Check and Image Data", dated Mar. 31, 2003, 156 pgs.
Bankers' Hotline, "Training Page: Learning the Bank Numbering System", Copyright 2004, 2 pgs.
BrainJar Validation Algorithms, archived on Mar. 16, 2016 from BrainJar.com, 2 pgs.
Canon White Paper, “Two Words Every Business Should Know—Remote Deposit”, dated 2005, 7 pgs.
CBR online, “Diebold launches ATM depository technology”, Oct. 4, 2007, 5 pgs.
Cheq Information Technology White Paper, “Teller Scanner Performance and Scanner Design: Camera Position Relative to the Feeder”, dated 2005, 7 pgs.
De Jesus, Angie et al., “Distributed Check Processing in a Check 21 Environment”, dated Nov. 2004, 22 pgs.
Federal Reserve Adoption of DSTU X9.37-2003, Image Cash Letter Customer Documentation Version 1.8, dated Oct. 1, 2008, 48 pgs.
Fielding, R. et al, “RFC-2616—Hypertext Transfer Protocol”, Network Working Group, The Internet Society copyright 1999, 177 pgs.
Hill, Simon, "From J-Phone to Lumia 1020: A Complete History of the Camera Phone", dated Aug. 11, 2013, 19 pgs.
Instrument—Definition from the Merriam-Webster Online Dictionary, dated Mar. 2, 2019, 1 pg.
Instrument—Definition of instrument from the Oxford Dictionaries (British & World English), dated Jul. 2, 2017, 44 pgs.
iPhone Application Programming Guide Device Support, dated Apr. 26, 2009, 7 pgs.
Apple Announces the New iPhone 3GS—The Fastest, Most Powerful iPhone Yet, Press Release dated Jun. 8, 2009, 4 pgs.
Klein, Robert, Financial Services Technology, “Image Quality and Usability Assurance: Phase 1 Project”, dated Jul. 23, 2004, 67 pgs.
Lange, Bill, “Combining Remote Capture and IRD Printing, A Check 21 Strategy For Community and Regional Banks”, dated 2005, 25 pgs.
Lee, Jeanne, “Mobile Check Deposits: Pro Tips to Ensure They Go Smoothly”, dated Feb. 19, 2016, 6 pgs.
Meara, Bob, “State of Remote Deposit Capture 2015: Mobile is the New Scanner”, Dated May 26, 2015, obtained from the Internet at: https://www.celent.com/insights/57842967, 3 pgs.
Meara, Bob, “State of Remote Deposit Capture 2015 Mobile is the New Scanner”, dated May 2015, 56 pgs.
Meara, Bob, "USAA's Mobile Remote Deposit Capture", dated Jun. 26, 2009, 2 pgs.
Mitek's Mobile Deposit Processes More Than Two Billion Checks, $1.5 Trillion in Cumulative Check Value, dated Mar. 18, 2018, 2 pgs.
Mitek, “Video Release—Mitek MiSnap™ Mobile Auto Capture Improves Mobile Deposit® User Experience at Ten Financial Institutions”, dated Jul. 15, 2014, 2 pgs.
NCR, Mobile Remote Deposit Capture (RDC), copyright 2011, 8 pgs.
Nokia N90 Review Digital Trends, dated Feb. 11, 2019, obtained from the Internet at: https://www.digitaltrends.com/cell-phone-reviews/nokia-n90-review/, 11 pgs.
Nokia N95 8GB User Guide, copyright 2009, (from the Wells Fargo Bank, N.A. IPR2019-00815, filed on Mar. 20, 2019), Part 1 of 3, 67 pgs.
Nokia N95 8GB User Guide, copyright 2009, (from the Wells Fargo Bank, N.A. IPR2019-00815, filed on Mar. 20, 2019), Part 2 of 3, 60 pgs.
Nokia N95 8GB User Guide, copyright 2009, (from the Wells Fargo Bank, N.A. IPR2019-00815, filed on Mar. 20, 2019), Part 3 of 3, 53 pgs.
Patel, Kunur, Ad Age, “How Mobile Technology is Changing Banking's Future”, dated Sep. 21, 2009, 3 pgs.
Remote Deposit Capture Basic Requirements, dated Aug. 22, 2009, 1 pg.
Remote Deposit Capture.com Scanner Matrix, dated Oct. 21, 2011, 3 pgs.
Rowles, Tony, USAA v. Wells Fargo, No. 2:18-cv-245-JRG, e-mail correspondence dated Jan. 24, 2019, 2 pgs.
Sechrest, Stuart et al., “Windows XP Performance”, Microsoft, dated Jun. 1, 2001, 20 pgs.
Spencer, Harvey, "White Paper Check 21 Controlling Image Quality at the Point of Capture", dated 2004, 7 pgs.
Timothy R. Crews list of Patents, printed from the United States Patent and Trademark Office on Feb. 13, 2019, 7 pgs.
Van Dyke, Jim, “2017 Mitek Mobile Deposit Benchmark Report”, copyright 2017, 50 pgs.
Wausau, “Understanding Image Quality & Usability Within a New Environment”, copyright 2019, 1 pg.
Whitney, Steve et al., “A Framework For Exchanging Image Returns”, dated Jul. 2001, 129 pgs.
12 CFR § 229.51 and Appendix D to Part 229 (Jan. 1, 2005 edition), 3 pgs.
149 Cong. Rec. H9289, Oct. 8, 2003, 6 pgs.
“Accept “Customer Not Present” Checks,” Accept Check Online, http://checksoftware.com, Cited in U.S. Pat. No. 7,900,822, as dated 2007 (1 pg).
Apple Announces the New iPhone 3GS—The Fastest, Most Powerful iPhone Yet, Jun. 8, 2009, located on the Internet at: http://www.apple.com/newsroom/2009/06/08Apple-Announces-the-New-iPhone-3GS-The-Fastest-Most-Powerful-iPhone-Yet, 4 pgs.
Apple Reinvents the Phone with iPhone, Jan. 2007, located on the Internet at: https://www.apple.com/newsroom/2007/01/09Apple-Reinvents-the-Phone-with-iPhone/, 4 pgs.
Askey, Canon EOS 40D Review (pts.1,4,10), Digital Photography Review, located on the Internet at: https://www.dpreview.com/reviews/canoneos40d, 24 pgs.
Askey, Leica Digilux 2 Review (pts.1,3,7), Digital Photography Review, May 20, 2004, located on the Internet at: https://www.dpreview.com/reviews/leicadigilux2, 20 pgs.
Askey, Nikon D300 In-depth Review (pts.1,3,9), Digital Photography Review, Mar. 12, 2008, located on the Internet at: https://www.dpreview.com/reviews/nikond300, 24 pgs.
Askey, Panasonic Lumix DMC-L1 Review (pts.1,3,7), Digital Photography Review, Apr. 11, 2007, located on the Internet at: https://www.dpreview.com/reviews/panasonicdmcl1, 24 pgs.
Askey, Sony Cyber-shot DSC-R1 Review (pts.1,3,7), Digital Photography Review, Dec. 6, 2005, located on the Internet at: http://www.dpreview.com/reviews/sonydscr1, 24 pgs.
Automated Clearing Houses (ACHs), Federal Reserve Bank of New York (May 2000) available at: https://www.newyorkfed.org/aboutthefed/fedpoint/fed31.html, (attached as Exhibit 12 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 4 pgs.
“Adjusting Brightness and Contrast”, www.eaglesoftware.com/adjustin.htm, retrieved on May 4, 2009 (4 pgs).
Berman, How Hitchcock Turned a Small Budget Into a Great Triumph, Time.com, Apr. 29, 2015, located on the Internet at: http://time.com/3823112/alfred-hitchcock-shadow-of-a-doubt, 1 pg.
“Best practices for producing quality digital image files,” Digital Images Guidelines, http://deepblue.lib.umich.edu/bitstream/2027.42/40247/1/Images-Best_Practice.pdf, downloaded 2007 (2 pgs).
Big Red Book, Adobe Systems Incorporated, copyright 2000, (attached as Exhibit 27 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 45 pgs.
Canon EOS 40D Digital Camera Instruction Manual, located on the Internet at http://gdlp01.c-wss.com/gds/6/0900008236/01/EOS40D_HG_EN.pdf (attached as Exhibit 6 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 38 pgs.
“Chapter 7 Payroll Programs,” Uniform Staff Payroll System, http://www2.oecn.k12.oh.us/www/ssdt/usps/usps_user_guide_005.html, Cited in U.S. Pat. No. 7,900,822, as dated 2007 (9 pgs).
“Check 21—The check is not in the post”, RedTitan Technology 2004, http://www.redtitan.com/check21.htm (3 pgs).
“Check 21 Solutions,” Columbia Financial International, Inc. http://www.columbiafinancial.us/check21/solutions.htm, Cited in U.S. Pat. No. 7,900,822, as dated 2007 (8 pgs).
Check Clearing for the 21st Century Act Foundation for Check 21 Compliance Training, Federal Financial Institutions Examination Council, (Oct. 16, 2004), available on the Internet at: https://web.archive.org/web/20041016100648/https://www.ffiec.gov/exam/check21/check21foundationdoc.htm, (excerpts attached as Exhibit 20 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 11 pgs.
“Check Fraud: A Guide to Avoiding Losses”, All Net, http://all.net/books/audit/checkfraud/security.htm, Cited in U.S. Pat. No. 7,900,822, as dated 2007 (1 pg).
Chen, Brian et al., iPhone 3GS Trounces Predecessors, Rivals in Web Browser Speed Test, Wired, Jun. 24, 2009, located on the Internet at: www.wired.com/2009.3gs-speed/, 10 pgs.
“Clearing House Electronic Check Clearing System (CHECCS) Operating Rules,” An IP.com Prior Art Database Technical Disclosure, Jul. 29, 2015 (35 pgs).
“Compliance with Regulation CC”, http://www.federalreserve.gov/Pubs/regcc/regcc.htm, Jan. 24, 2006 (6 pgs).
“Customer Personalized Bank Checks and Address Labels” Checks Your Way Inc., http://www.checksyourway.com/htm/web_pages/faq.htm, Cited in U.S. Pat. No. 7,900,822, as dated 2007 (6 pgs).
Defendant Wells Fargo Bank, N.A.'s Answer, Affirmative Defenses, and Counterclaims to Plaintiff's Complaint, dated Aug. 14, 2018, 64 pgs.
Declaration of Peter Alexander, Ph.D., CBM2019-00004, Nov. 8, 2018, 180 pgs.
“Deposit Now: Quick Start User Guide,” BankServ, 2007, 29 pages.
“Direct Deposit Application for Payroll”, Purdue University, Business Office Form 0003, http://purdue.edu/payroll/pdf/directdepositapplication.pdf, Jul. 2007 (2 pgs).
“Direct Deposit Authorization Form”, www.umass.edu/humres/library/DDForm.pdf, May 2003 (3 pgs).
“Direct Deposit,” University of Washington, http://www.washington.edu/admin/payroll/directdeposit.html, Cited in U.S. Pat. No. 7,900,822, as dated 2007 (3 pgs).
“Electronic Billing Problem: The E-check is in the mail” American Banker—vol. 168, No. 95, May 19, 2003 (4 pgs).
Excerpts from American National Standard for Financial Services, ANS, X9.100-140-2004—Specifications for an Image Replacement Document—IRD, Oct. 1, 2004, 16 pgs.
“First Wireless Handheld Check and Credit Card Processing Solution Launched by Commerciant®, MobileScape® 5000 Eliminates Bounced Checks, Enables Payments Everywhere,” Business Wire, Mar. 13, 2006, 3 pages.
“Frequently Asked Questions” Bank of America, http://www.bankofamerica.com/deposits/checksave/index.cfm?template=lc_faq_bymail, Cited in U.S. Pat. No. 7,900,822, as dated 2007 (2 pgs).
“Full Service Direct Deposit”, www.nonprofitstaffing.com/images/upload/dirdepform.pdf. Cited in U.S. Pat. No. 7,900,822, as dated 2001, (2 pgs).
Gates, A History of Wireless Standards, Wi-Fi Back to Basics, Aerohive Blog, Jul. 2015, located on the Internet at: http://blog.aerohive.com/a-history-of-wireless-standards, 5 pgs.
Guidelines for Evaluation of Radio Transmission Technologies for IMT-2000, dated 1997, ITU-R M.1225, located on the Internet at: https://www.itu.int/dms_pubrec/itu-r/rec/m/R-REC-M.1225-0-199702-I!!PDF-E.pdf, 60 pgs.
Helio Ocean User Manual, located on the Internet at: https://standupwireless.com/wp-content/uploads/2017/04/Manual_PAN-TECH_OCEAN.pdf (excerpts attached as Exhibit 10 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 76 pgs.
“How to Digitally Deposit a Check Image”, Smart Money Daily, Copyright 2008 (5 pgs).
HTC Touch Diamond Manual, copyright 2008, (attached as Exhibit 11 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 257 pgs.
Humphrey, David B. and Hunt, Robert, “Getting Rid of Paper: Savings From Check 21”, Working Paper No. 12-12, Research Department, Federal Reserve Bank of Philadelphia, (May 2012), available on the Internet at: https://philadelphiafed.org/-/media/research-and-data/publications/working-papers/2012/wp12-12.pdf, (attached as Exhibit 14 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 29 pgs.
“ImageNet Mobile Deposit Provides Convenient Check Deposit and Bill Pay to Mobile Consumers,” Miteksystems, 2008 (2 pgs).
iPhone App Store Downloads Top 10 Million in First Weekend, Jul. 14, 2008, located on the Internet at: https://www.apple.com/newsroom/2008/07/14iPhone-App-Store-Downloads-Top-10-Million-in-First-Weekend, 4 pgs.
“It's the easiest way to Switch banks”, LNB, http://www.lnbky.com/pdf/LNBswitch-kit10-07.pdf, Cited in U.S. Pat. No. 7,996,316, as dated 2007 (7 pgs).
Joinson et al., Olympus E-30 Review (pts.1,4,8), Digital Photography Review, Mar. 24, 2009, located on the Internet at: www.dpreview.com/reviews/olympuse30, 6 pgs.
Knerr et al., The A2iA Intercheque System: Courtesy Amount and Legal Amount Recognition for French Checks, in Automated Bankcheck Processing 43-86, Impedovo et al. eds., 1997, 50 pgs.
Lacker, Jeffrey M., “Payment System Disruptions and the Federal Reserve Following Sep. 11, 2001”, The Federal Reserve Bank of Richmond, (Dec. 23, 2003) (attached as Exhibit 19 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 55 pgs.
Leica Digilux 2 Instructions located on the Internet: http://www.overgaard.dk/pdf/d2_manual.pdf (attached as Exhibit 2 from the Defendant Wells Fargo Bank N.A.'s Answer dated Aug. 14, 2018), 95 pgs.
“Lesson 38—More Bank Transactions”, Turtle Soft, http://www.turtlesoft.com/goldenseal-software-manual.lesson38.htm, Cited in U.S. Pat. No. 7,900,822, as dated 2007 (8 pgs).
“Machine Accepts Bank Deposits”, New York Times, Apr. 12, 1961, 1 pg.
Mackenzie, E., Photography Made Easy, copyright 1845, 80 pgs.
“Middleware”, David E. Bakken, Encyclopedia of Distributed Computing, Kluwer Academic Press, 2001 (6 pgs).
“Mitek Systems Announces Mobile Deposit Application for Apple iPhone,” http://prnewswire.com/cgi-bin/stories/pl?ACCT=104&STORY=/www/story/10-01- . . . , Nov. 25, 2008 (2 pgs).
Motorola RAZR MAXX V6 User Manual, located on the Internet at: https://www.phonearena.com/phones/Motorola-RAZR-MAXX-V6_id1680, (attached as Exhibit 7 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 36 pgs.
MOTOMANUAL for MOTORAZR, located on the Internet at: https://www.cellphones.ca/downloads/phones/manuals/motorola-razr-v3xx-manual.pdf (excerpts attached as Exhibit 8 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 34 pgs.
Nikon Digital Camera D300 User's Manual, located on the Internet at: http://download.nikonimglib.com/archive2/iBuJv00Aj97i01y8BrK49XX0Ts69/D300,EU(En)04.pdf (attached as Exhibit 5 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 195 pgs.
Nokia N95 8GB User Guide, copyright 2009, located on the Internet at: https://www.nokia.com/en_int/phones/sites/default/files/user-guides/Nokia_N95_8GB_Extended_UG_en.pdf (excerpts attached as Exhibit 9 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 77 pgs.
“NOVA Enhances Electronic Check Service to Benefit Multi-Lane Retailers,” Business Wire, Nov. 28, 2006, 2 pages.
Panasonic Operating Instructions for Digital Camera/Lens Kit Model No. DMC-L1K, https://www.panasonic.com/content/dam/Panasonic/support_manual/Digital_Still_Camera/English_01-vqt0-vqt2/vqt0w95_L1_oi.pdf (attached as Exhibit 4 from the Defendant Wells Fargo Back N.A.'s Answer dated Aug. 14, 2018), 129 pgs.
“Personal Finance”, PNC, http://www.pnc.com/webapp/unsec/productsandservice.do?sitearea=/PNC/home/personal/account+services/quick+switch/quick+switch+faqs, Cited in U.S. Pat. No. 7,900,822, as dated 2007 (12 pgs).
POP, ARC and BOC—A Comparison, Federal Reserve Banks, at 1 (Jan. 7, 2009), available on the Internet at: https://web.archive.org/web/20090107101808/https://www.frbservices.org/files/eventseducation/pdf/pop_arc_boc_comparison.pdf (attached as Exhibit 13 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 3 pgs.
Quinn and Roberds, The Evolution of the Check as a Means of Payment: A Historical Survey, Federal Reserve Bank of Atlanta, Economic Review, 2008, 30 pgs.
Petition filed by Wells Fargo Bank, N.A. for Covered Business Method Review of Claims 1-20 of U.S. Pat. No. 9,818,090, dated Nov. 8, 2018, 90 pgs.
Petition filed by Wells Fargo Bank, N.A. for Covered Business Method Review of Claims 1-20 of U.S. Pat. No. 9,336,517, dated Nov. 8, 2018, 98 pgs.
Petition filed by Wells Fargo Bank, N.A. for Covered Business Method Review of Claims 1-20 of U.S. Pat. No. 8,977,571, dated Nov. 8, 2018, 95 pgs.
Petition filed by Wells Fargo Bank, N.A. for Covered Business Method Review of Claims 1-23 of U.S. Pat. No. 8,699,779, dated Nov. 8, 2018, 101 pgs.
“Refractive index”, Wikipedia, the free encyclopedia; http://en.wikipedia.org/wiki/Refractive_index, Oct. 16, 2007 (4 pgs).
“Remote check deposit is the answer to a company's banking problem,” Daily Breeze, Torrance, CA, Nov. 17, 2006, 2 pgs.
“Remote Deposit Capture”, Plante & Moran, http://plantemoran.com/industries/financial/institutions/bank/resources/community+bank+advisor/2007+summer+issue/remote+deposit+capture.htm, Cited in U.S. Pat. No. 7,900,822, as dated 2007 (3 pgs).
“Remote Deposit” National City, http://www.nationalcity.com/smallbusiness/cashmanagement/remotedeposit/default.asp; Cited in U.S. Pat. No. 7,900,822, as dated 2007 (1 pg).
Rockwell, The Megapixel Myth, KenRockwell.com, 2008, located on the Internet at: http://kenrockwell.com/tech/mpmyth.htm, 6 pgs.
“Save on ATM Fees”, RedEye Edition, Chicago Tribune, Chicago, IL Jun. 30, 2007 (2 pgs).
Shah, Moore's Law, Continuous Everywhere But Differentiable Nowhere, Feb. 12, 2009, located on the Internet at: http://samjshah.com/2009/02/24/moores-law/, 5 pgs.
“SNB Check Capture: Smartclient User's Guide,” Nov. 2006, 21 pgs.
Sony Digital Camera User's Guide/ Trouble Shooting Operating Instructions, copyright 2005, located on the Internet at: https://www.sony.co.uk/electronics/support/res/manuals/2654/26544941M.pdf (attached as Exhibit 3 from the Defendant Wells Fargo Bank N.A.'s Answer dated Aug. 14, 2018), 136 pgs.
Sumits, Major Mobile Milestones—The Last 15 Years, and the Next Five, Cisco Blogs, Feb. 3, 2016, located on the Internet at: https://blogs.cisco.com/sp/mobile-vni-major-mobile-milestones-the-last-15-years-and-the-next-five, 12 pgs.
“Switching Made Easy,” Bank of North Georgia, http://www.banknorthgeorgia.com/cmsmaster/documents/286/documents616.pdf, 2007 (7 pgs).
“Two Words Every Business Should Know: Remote Deposit,” Canon, http://www.rpsolutions.com/rpweb/pdfs/canon_rdc.pdf, 2005 (7 pgs).
“Virtual Bank Checks”, Morebusiness.com, http://www.morebusiness.com/running_yourbusiness/businessbits/d908484987.brc, Cited in U.S. Pat. No. 7,900,822, as dated 2007 (3 pgs).
“WallStreetGrapevine.com” Stocks on the Rise: JADG, BKYI, MITK; Mar. 3, 2008 (4 pgs).
Wausau Financial Systems, Understanding Image Quality & Usability Within a New Environment, 2006, 22 pgs.
“What is check Fraud”, National Check Fraud Center, http://www.ckfraud.org/ckfraud.html, Cited in U.S. Pat. No. 7,900,822, as dated 2007 (12 pgs).
“Exchangeable image file format for digital still cameras: Exif Version 2.2,” Standard of Electronics and Information Technology Industries Associate, JEITA CP-3451, Technical Standardization Committee on AV & IT Storage Systems and Equipments, Japan Electronics and Information Technology Industries Association, Apr. 2002 (154 pgs). (retrieved from: http://www.exif.org/Exif2-2.PDF).
Affinity Federal Credit Union, “Affinity Announces Online Deposit,” Aug. 4, 2005 (1 pg).
Albrecht, W. Steve, “Check Kiting: Detection, Prosecution and Prevention,” The FBI Law Enforcement Bulletin, Nov. 1, 1993 (6 pgs).
Alves, Vander and Borba, Paulo; “Distributed Adapters Pattern: A Design for Object-Oriented Distributed Applications”; First Latin American Conference on Pattern Languages of Programming; Oct. 2001; pp. 132-142; Rio de Janeiro, Brazil (11 pgs).
Amber Avalona-Butler / Paraglide, “At Your Service: Best iPhone Apps for Military Lifestyle,” Jul. 9, 2010 (2 pgs).
Anderson, Milton M. “FSML and Echeck”, Financial Services Technology Consortium, 1999 (17 pgs).
Aradhye, Hrishikesh B., “A Generic Method for Determining Up/Down Orientation of Text in Roman and Non-Roman Scripts,” Pattern Recognition Society, Dec. 13, 2014, 18 pages.
Archive Index Systems; Panini My Vision X-30 or VX30 or X30 © 1994-2008 Archive Systems, Inc. P.O. Box 40135 Bellevue, WA USA 98015 (2 pgs).
Association of German Banks, SEPA 2008: Uniform Payment Instruments for Europe, Berlin, Cited in U.S. Pat. No. 7,900,822, as dated Jul. 2007, Bundesverband deutscher Banken e.V., (42 pgs).
Automated Merchant Systems, Inc., “Electronic Check Conversion,” http://www.automatedmerchant.com/electronic_check_conversion.cfm, 2006, downloaded Oct. 18, 2006 (3 pgs).
Bank Systems & Technology, Untitled Article, May 1, 2006, http://www.banktech.com/showarticle.jhtml?articleID=187003126, “Are you Winning in the Payment World?” (4 pgs).
BankServ, “DepositNow: What's the difference?” Cited in U.S. Pat. No. 7,970,677, as dated 2006, (4 pgs).
BankServ, Product Overview, http://www.bankserv.com/products/remotedeposit.htm, Cited in U.S. Pat. No. 7,970,677, as dated 2006, (3 pgs).
Bills, Steve, “Automated Amount Scanning is Trend in Remote-Deposit,” American Banker, New York, NY, Aug. 30, 2005, (3 pgs).
Blafore, Bonnie “Lower Commissions, Fewer Amenities”, Better Investing, Madison Heights: Feb. 2003, vol. 52, Iss 6, (4 pgs).
BLM Technologies, “Case Study: Addressing Check 21 and RDC Error and Fraud Threats,” Remote Deposit Capture News Articles from Jun. 11, 2007, Retrieved from http://www.remotedepositcapture.com/News/june_11_2007.htm on Feb. 19, 2008 (5 pgs).
Blue Mountain Consulting, from URL: www.bluemountainconsulting.com, Cited in U.S. Pat. No. 7,900,822, as dated Apr. 26, 2006 (3 pgs).
Board of Governors of the Federal Reserve System, “Report to the Congress on the Check Clearing for the 21st Century Act of 2003”, Apr. 2007, Submitted to Congress pursuant to section 16 of the Check Clearing for the 21st Century Act of 2003, (59 pgs).
Braun, Tim, “Camdesk—Towards Portable and Easy Document Capture,” Image Understanding and Pattern Recognition Research Group, Department of Computer Science, University of Kaiserslautern, Technical Report, Mar. 29, 2005 (64 pgs). (Retrieved from: https://pdfs.semanticscholar.org/93b2/ea0d12f24c91f3c46fa1c0d58a76bb132bd2.pdf).
Bruene, Jim; “CheckFree to Enable In-Home Remote Check Deposit for Consumers and Small Business”, NetBanker.com, Financial Insite, Inc., http://www.netbanker.com/2008/02/checkfree_to_enableinhome_rem.html, Feb. 5, 2008 (3 pgs).
Bruene, Jim; “Digital Federal Credit Union and Four Others Offer Consumer Remote Deposit Capture Through EasCorp”, NetBanker—Tracking Online Finance, www.netbanker.com/2008/04/digital_federal_credit_union_a.html, Apr. 13, 2008 (3 pgs).
Bruno, M., “Instant Messaging,” Bank Technology News, Dec. 2002 (3 pgs).
Burnett, J. “Depository Bank Endorsement Requirements,” BankersOnline.com, http://www.bankersonline.com/cgi-bin/printview/printview.pl, Jan. 6, 2003 (3 pgs).
Canon, ImageFormula CR-25/CR-55, “Improve Your Bottom Line with Front-Line Efficiencies”, 0117W117, 1207-55/25-1 OM-BSP, Cited in U.S. Pat. No. 7,949,587 as dated 2007. (4 pgs).
Carrubba, P. et al., “Remote Deposit Capture: A White Paper Addressing Regulatory, Operational and Risk Issues,” NetDeposit Inc., 2006 (11 pgs).
Century Remote Deposit High-Speed Scanner User's Manual Release 2006, (Century Manual), Century Bank, 2006, (32 pgs).
Chiang, Chuck, The Bulletin, “Remote banking offered”, http://bendbulletin.com/apps/pbcs.dll/article?AID=/20060201/BIZ0102/602010327&templ . . . , May 23, 2008 (2 pgs).
CNN.com/technology, “Scan, deposit checks from home”, www.cnn.com/2008/TECH/biztech/02/07/check.scanning.ap/index.html, Feb. 7, 2008 (3 pgs).
Constanzo, Chris, “Remote Check Deposit: Wells Captures A New Checking Twist”, Bank Technology News Article—May 2005, www.americanbanker.com/btn_article.html?id=20050502YQ50FSYG (2 pgs).
Craig, Ben, “Resisting Electronic Payment Systems: Burning Down the House?”, Federal Reserve Bank of Cleveland, Jul. 1999 (4 pgs).
Creativepaymentsolutions.com, “Creative Payment Solutions—Websolution,” www.creativepaymentsolution.com/cps/financialservices/websolution/default.html, Copyright 2008, Creative Payment Solutions, Inc. (1 pg).
Credit Union Journal, “The Ramifications of Remote Deposit Capture Success”, www.cujournal.com/printthis.html?id=20080411EODZT57G, Apr. 14, 2008 (1 pg).
Credit Union Journal, “AFCU Averaging 80 DepositHome Transactions Per Day”, Credit Union Journal, Aug. 15, 2005 (1 pg).
Credit Union Management, “When You wish Upon an Imaging System . . . the Right Selection Process can be the Shining Star,” Credit Union Management, Aug. 1993, printed from the internet at <http://search.proquest.com/docview/227756409/14138420743684F7722/15?accountid=14 . . . >, on Oct. 19, 2013 (11 pgs).
DCU Member's Monthly—Jan. 2008, “PC Deposit—Deposit Checks from Home!”, http://www.mycreditunionnewsletter.com/dcu/0108/page1.html, Copyright 2008 Digital Federal Credit Union (2 pgs).
De Jesus, A. et al., “Distributed Check Processing in a Check 21 Environment: An educational overview of the opportunities and challenges associated with implementing distributed check imaging and processing solutions,” Panini, 2004, pp. 1-22.
De Queiroz, Ricardo et al., “Mixed Raster Content (MRC) Model for Compound Image Compression”, 1998 (14 pgs).
Debello, James et al., “RDM and Mitek Systems to Provide Mobile Check Deposit,” Mitek Systems, Inc., San Diego, California and Waterloo, Ontario, (Feb. 24, 2009), 2 pgs.
DeYoung, Robert; “The Financial Performance of Pure Play Internet Banks” Federal Reserve Bank of Chicago Economic Perspectives; 2001; pp. 60-75; vol. 25, No. 1 (16 pgs).
Dias, Danilo et al., “A Model for the Electronic Representation of Bank Checks”, Brasilia Univ. Oct. 2006 (5 pgs).
Digital Transactions News, “An ACH-Image Proposal For Check Roils Banks and Networks” May 26, 2006 (3 pgs).
Dinan, R.F. et al., “ImagePlus High Performance Transaction System”, IBM Systems Journal, 1990, vol. 29, No. 3 (14 pgs).
Doermann, David et al., “Progress in Camera-Based Document Image Analysis,” Proceedings of the Seventh International Conference on Document Analysis and Recognition (ICDAR 2003) 0-7695-1960-1/03, 2003, IEEE Computer Society, 11 pages.
Duvall, Mel, “Remote Deposit Capture,” Baseline, vol. 1, Issue 70, Mar. 2007, 2 pgs.
ECU Technologies, “Upost Remote Deposit Solution,” Retrieved from the internet https://www.eutechnologies.com/products/upost.html, downloaded 2009 (1 pg).
EFT Network Unveils FAXTellerPlus, EFT Network, Inc., www.eftnetwork.com, Jan. 13, 2009 (2 pgs).
ElectronicPaymentProviders, Inc., “FAQs: ACH/ARC, CheckVerification/Conversion/Guarantee, RCK Check Re-Presentment,” http://www.useapp.com/faq.htm, downloaded Oct. 18, 2006 (3 pgs).
Federal Check 21 Act, “New Check 21 Act effective Oct. 28, 2004: Banks No Longer Will Return Original Cancelled Checks,” Consumer Union's FAQ's and Congressional Testimony on Check 21, www.consumerlaw.org/initiatives/content/check21_content.html, Cited in U.S. Pat. No. 7,873,200, as dated Dec. 2005 (20 pgs).
Federal Reserve Board, “Check Clearing for the 21st Century Act”, FRB, http://www.federalreserve.gov/paymentsystems/truncation/, Mar. 1, 2006 (1 pg).
Federal Reserve System, “12 CFR, Part 229 [Regulation CC; Docket No. R-0926]: Availability of Funds and Collection of Checks,” Federal Registrar, Apr. 28, 1997, pp. 1-50.
Federal Reserve System, “Part IV, 12 CFR Part 229 [Regulation CC; Docket No. R-1176]: Availability of Funds and Collection of Checks; Final Rule,” Federal Registrar, vol. 69, No. 149, Aug. 4, 2004, pp. 47290-47328.
Fest, Glen., “Patently Unaware” Bank Technology News, Apr. 2006, Retrieved from the internet at URL:http://banktechnews.com/article.html?id=2006403T7612618 (5 pgs).
Fidelity Information Services, “Strategic Vision Embraces Major Changes in Financial Services Solutions: Fidelity's long-term product strategy ushers in new era of application design and processing,” Insight, 2004, pp. 1-14.
Fisher, Dan M., “Home Banking in the 21st Century: Remote Capture Has Gone Retail”, May 2008 (4 pgs).
Furst, Karen et al., “Internet Banking: Developments and Prospects”, Economic and Policy Analysis Working Paper 2000-9, Sep. 2000 (60 pgs).
Garry, M., “Checking Options: Retailers face an evolving landscape for electronic check processing that will require them to choose among several scenarios,” Supermarket News, vol. 53, No. 49, 2005 (3 pgs).
German Shegalov, Diplom-Informatiker, “Integrated Data, Message, and Process Recovery for Failure Masking in Web Services”, Dissertation Jul. 2005 (146 pgs).
Gupta, Amar et al., “An Integrated Architecture for Recognition of Totally Unconstrained Handwritten Numerals”, WP#3765, Jan. 1993, Productivity from Information Technology “Profit” Research Initiative Sloan School of Management (20 pgs).
Gupta, Maya R. et al., “OCR binarization and image pre-processing for searching historical documents,” Pattern Recognition, vol. 40, No. 2, Feb. 2007, pp. 389-397.
Hale, J., “Picture this: Check 21 uses digital technology to speed check processing and shorten lag time,” Columbus Business First, http://columbus.bizjournals.com/columbus/stories/2005/03/14focus1.html, downloaded 2007 (3 pgs).
Hartly, Thomas, “Banks Check Out New Image”, Business First, Buffalo: Jul. 19, 2004, vol. 20, Issue 43, (3 pgs).
Heckenberg, D. “Using Mac OS X for Real-Time Image Processing” Oct. 8, 2003 (15 pgs).
Herley, Cormac, “Efficient Inscribing of Noisy Rectangular Objects in Scanned Images,” 2004 International Conference on Image Processing, 4 pages.
Hildebrand, C. et al., “Electronic Money,” Oracle, http://www.oracle.com/oramag/profit/05-feb/p15financial.html, 2005, downloaded Oct. 18, 2006 (5 pgs).
Hillebrand, G., “Questions and Answers About the Check Clearing for the 21st Century Act, Check 21,” ConsumersUnion.org, http://www.consumersunion.org/finance/ckclear1002.htm, Jul. 27, 2004, downloaded Oct. 18, 2006 (6 pgs).
Iida, Jeanne, “The Back Office: Systems—Image Processing Rolls on as Banks Reap Benefits,” American Banker, Jul. 19, 1993, printed from the internet at <http://search.proquest.com/docview/292903245/14138420743684F7722/14?accountid=14 . . . >, on Oct. 19, 2013 (3 pgs).
Image Master, “Photo Restoration: We specialize in digital photo restoration and photograph repair of family pictures”, http://www.imphotorepair.com, Cited in U.S. Pat. No. 7,900,822, as downloaded Apr. 2007 (1 pg).
Investment Systems Company, “Portfolio Accounting System,” 2000, 34 pgs.
JBC, “What is a MICR Line?,” eHow.com, retrieved from http://www.ehow.com/about_4684793_what-micr-line.html on May 4, 2009 (2 pgs).
Johnson, Jennifer J., Secretary of the Board; Federal Reserve System, 12 CFR Part 229, Regulation CC; Docket No. R 1176, “Availability of Funds and Collection of Checks”. Cited in U.S. Pat. No. 7,900,822, as dated 2009, (89 pgs).
Kendrick, Kevin B., “Check Kiting, Float for Purposes of Profit,” Bank Security & Fraud Prevention, vol. 1, No. 2, 1994 (3 pgs).
Kiser, Elizabeth K.; “Modeling the Whole Firm: The Effect of Multiple Inputs and Financial Intermediation on Bank Deposit Rates;” FEDS Working Paper No. 2004-07; Jun. 3, 2003; pp. 1-46 (46 pgs).
Knestout, Brian P. et al., “Banking Made Easy” Kiplinger's Personal Finance Washington, Jul. 2003, vol. 57, Iss 7 (5 pgs).
Kornai Andras et al., “Recognition of Cursive Writing on Personal Checks”, Proceedings of International Workshop on the Frontiers in Handwriting Recognition, Cited in U.S. Pat. No. 7,900,822, as dated Sep. 1996, (6 pgs).
Lampert, Christoph et al., “Oblivious Document Capture and Real-Time Retrieval,” International Workshop on Camera Based Document Analysis and Recognition (CBDAR), 2005 (8 pgs). (Retrieved from: http://www-cs.ccny.cuny.edu/˜wolberg/capstone/bookwarp/LampertCBDAR05.pdf).
Levitin, Adam J., Remote Deposit Capture: A Legal and Transactional Overview, Banking Law Journal, p. 115, 2009 (RDC), 8 pgs.
Liang, Jian et al., Camera-Based Analysis of Text and Documents: A Survey, International Journal on Document Analysis and Recognition, Jun. 21, 2005, 21 pages.
Luo, Xi-Peng et al., “Design and Implementation of a Card Reader Based on Build-In Camera,” Proceedings of the 17th International Conference on Pattern Recognition, 2004, 4 pages.
Masonson, L., “Check Truncation and ACH Trends—Automated Clearing Houses”, healthcare financial management association, http://www.findarticles.com/p/articles/mi_m3276/is_n7_v47/ai_14466034/print, 1993 (2 pgs).
Matthews, Deborah, “Advanced Technology Makes Remote Deposit Capture Less Risky,” Indiana Bankers Association, Apr. 2008 (2 pgs).
Metro 1 Credit Union, “Remote Banking Services,” http://www.metro1cu.org/metro1cu/remote.html, downloaded Apr. 17, 2007 (4 pgs).
Mitek systems, “Imagenet Mobile Deposit”, San Diego, CA, downloaded 2009 (2 pgs).
Mitek Systems: Mitek Systems Launches First Mobile Check Deposit and Bill Pay Application, San Diego, CA, Jan. 22, 2008 (3 pgs).
Mohl, Bruce, “Banks Reimbursing ATM Fee to Compete With Larger Rivals”, Boston Globe, Boston, MA, Sep. 19, 2004 (3 pgs).
Moreau, T., “Payment by Authenticated Facsimile Transmission: a Check Replacement Technology for Small and Medium Enterprises,” CONNOTECH Experts-conseils, Inc., Apr. 1995 (31 pgs).
Nelson, B. et al., “Remote deposit capture changes the retail landscape,” Northwestern Financial Review, http://findarticles.com/p/articles/mi_qa3799/is_200607/ai_n16537250, 2006 (3 pgs).
Netbank, Inc., “Branch Out: Annual Report 2004,” 2004 (150 pgs).
Netbank, Inc., “Quick Post: Deposit and Payment Forwarding Service,” 2005 (1 pg).
NetDeposit Awarded Two Patents for Electronic Check Process, NetDeposit, Jun. 18, 2007, (1 pg).
Nixon, Julie et al., “Fiserv Research Finds Banks are Interested in Offering Mobile Deposit Capture as an,” Fiserv, Inc. Brookfield, Wis., (Business Wire), (Feb. 20, 2009), 2 pgs.
Online Deposit: Frequently Asked Questions, http://www.depositnow.com/faq.html, Copyright 2008 (1 pg).
Onlinecheck.com/Merchant Advisors, “Real-Time Check Debit”, Merchant Advisors: Retail Check Processing Check Conversion, http://www.onlinecheck.com/wach/rcareal.htm, Cited in U.S. Pat. No. 7,900,822, as dated 2006 (3 pgs).
Oxley, Michael G., from committee on Financial Services; “Check Clearing For The 21st Century Act”, 108th Congress, 1st Session House of Representatives report 108-132, Jun. 2003 (20 pgs).
Oxley, Michael G., from the committee of conference; “Check Clearing For the 21st Century Act” 108th Congress, 1st Session Senate report 108-291, Oct. 1, 2003 (27 pgs).
Palacios, Rafael et al., “Automatic Processing of Brazilian Bank Checks”. Cited in U.S. Pat. No. 7,900,822, as dated 2002 (28 pgs).
Patterson, Scott “USAA Deposit@Home—Another WOW moment for Net Banking”, NextCU.com, Jan. 26, 2007 (5 pgs).
Public Law 108-100, 108 Congress; “An Act Check Clearing For the 21st Century Act”, Oct. 28, 2003, 117 STAT. 1177 (18 pgs).
Rao, Bharat; “The Internet and The Revolution in Distribution: A Cross-Industry Examination”; Technology in Society; 1999; pp. 287-306; vol. 21, No. 3 (20 pgs).
Remotedepositcapture, URL:www.remotedepositcapture.com, Cited in U.S. Pat. No. 7,900,822, as dated 2006 (5 pgs).
RemoteDepositCapture.com, “PNC Bank to Offer Ease of Online Deposit Service Integrated with QuickBooks to Small Businesses”, Remote Deposit Capture News Articles from Jul. 24, 2006, (2 pgs).
RemoteDepositCapture.com, Remote Deposit Capture News Articles from Jul. 6, 2006, “BankServ Announces New Remote Deposit Product Integrated with QuickBooks” (3 pgs).
Remotedepositcapture.com, LLC, “Remote Deposit Capture Overview,” RDC Overview, http://remotedepositcapture.com/overview/RDC_overview.htm, Cited in U.S. Pat. No. 7,900,822, as dated Mar. 12, 2007 (4 pgs).
Richey, J. C. et al., “EE 4530 Check Imaging,” Nov. 18, 2008 (10 pgs).
Ritzer, J.R. “Hinky Dinky helped spearhead POS, remote banking movement”, Bank Systems and Equipment, vol. 21, No. 12, Dec. 1984 (1 pg).
Rivlin, Alice M. et al., Chair, Vice Chair—Board of Governors, Committee on the Federal Reserve in the Payments Mechanism—Federal Reserve System, “The Federal Reserve in the Payments Mechanism”, Jan. 1998 (41 pgs).
Rose, Sarah et al., “Best of the Web: The Top 50 Financial Websites”, Money, New York, Dec. 1999, vol. 28, Iss. 12 (8 pgs).
Shelby, Hon. Richard C. (Committee on Banking, Housing and Urban Affairs); “Check Truncation Act of 2003”, calendar No. 168, 108th Congress, 1st Session Senate report 108-79, Jun. 2003 (27 pgs).
SoyBank Anywhere, “Consumer Internet Banking Service Agreement,” Dec. 6, 2004 (6 pgs).
Teixeira, D., “Comment: Time to Overhaul Deposit Processing Systems,” American Banker, Dec. 10, 1998, vol. 163, No. 235, p. 15 (3 pgs).
Thailandguru.com: How and where to Pay Bills @ www.thailandguru.com/paying-bills.html, © 1999-2007 (2 pgs).
The Automated Clearinghouse, “Retail Payment Systems; Payment Instruments Clearing and Settlement: The Automated Clearinghouse (ACH)”, www.ffiec.gov/ffiecinfobase/booklets/retail/retail_02d.html, Cited in U.S. Pat. No. 7,900,822, as dated Dec. 2005 (3 pgs).
The Green Sheet 2.0: Newswire, “CO-OP adds home deposit capabilities to suite of check imaging products”, www.greensheet.com/newswire.php?newswire_id=8799, Mar. 5, 2008 (2 pgs).
Tygar, J.D., Atomicity in Electronic Commerce, In ACM Networker, 2:2, Apr./May 1998 (12 pgs).
Valentine, Lisa, “Remote Deposit Capture Hot Just Got Hotter,” ABA Banking Journal, Mar. 2006, p. 1-9.
Vaream, Craig, “Image Deposit Solutions: Emerging Solutions for More Efficient Check Processing,” JP Morgan Chase, Nov. 2005 (16 pgs).
Wade, Will, “Early Debate on Remote-Capture Risk,” American Banker, New York, NY, May 26, 2004 (3 pgs).
Wade, Will, “Early Notes: Updating Consumers on Check 21” American Banker Aug. 10, 2004 (3 pgs).
Wallison, Peter J., “Wal-Mart Case Exposes Flaws in Banking-Commerce Split”, American Banker, vol. 167. No. 8, Jan. 11, 2002 (3 pgs).
Wells Fargo 2005 News Releases, “The New Wells Fargo Electronic Deposit Services Break Through Banking Boundaries In The Age of Check 21”, San Francisco, Mar. 28, 2005, www.wellsfargo.com/press/3282005_check21?Year=2005 (1 pg).
Wells Fargo Commercial, “Remote Deposit”, www.wellsfargo.com/com/treasury_mgmt/receivables/electronic/remote_deposit, Copyright 2008 (1 pg).
White, J.M. et al., “Image Thresholding for Optical Character Recognition and Other Applications Requiring Character Image Extraction”, IBM J. RES. Development, Jul. 1983, vol. 27, No. 4 (12 pgs).
Whitney et al., “Reserve Banks to Adopt DSTU X9.37-2003 Format for Check 21 Image Services”, American Bankers Association, May 18, 2004, http://www.aba.com/NR/rdonlyres/CBDC1A5C-43E3-43CC-B733-BE417C638618/35930/DSTUFormat.pdf (2 pages).
Wikipedia®, “Remote Deposit,” http://en.wikipedia.org/wiki/Remote_deposit, 2007 (3 pgs).
Windowsfordevices.com, “Software lets camera phone users deposit checks, pay bills”, www.windowsfordevices.com/news/NS3934956670.html, Jan. 29, 2008 (3 pgs).
Wolfe, Daniel, “Check Image Group Outlines Agenda,” American Banker, New York, N.Y.: Feb. 13, 2009, vol. 174, Iss. 30, p. 12. (2 pgs).
Woody Baird Associated Press, “Pastor's Wife got Scammed—She Apparently Fell for Overseas Money Scheme,” The Commercial Appeal, Jul. 1, 2006, p. A. 1.
Zandifar, A., “A Video-Based Framework for the Analysis of Presentations/Posters,” International Journal on Document Analysis and Recognition, Feb. 2, 2005, 10 pages.
Zhang, C.Y., “Robust Estimation and Image Combining” Astronomical Data Analysis Software and Systems IV, ASP Conference Series, 1995 (5 pgs).
Zions Bancorporation, “Moneytech, the technology of money in our world: Remote Deposit,” http://www.bankjunior.com/pground/moneytech/remote_deposit.jsp, 2007 (2 pgs).
“Quicken Bill Pay”, Retrieved from the Internet on Nov. 27, 2007 at: <URL: http://quicken.intuit.com/quicken-bill-pay.jhtml>, 2 pgs.
“Start to Simplify with Check Imaging a Smarter Way to Bank”, Retrieved from the Internet on Nov. 27, 2007, at: <URL: http://www.midnatbank.com/Internet%20Banking/internet_Banking.html>, 3 pgs.
Motomanual, MOTOROKR-E6-GSM-English for wireless phone, copyright 2006, 144 pgs.
CBM2019-00005 U.S. Pat. No. 8,699,779, Patent Owner's Sur-Reply Brief to Petitioner's Reply Brief to Patent Owner Preliminary Response Pursuant to Authorization Provided in Paper No. 15, dated May 1, 2019, 7 pgs.
CBM2019-00004 U.S. Pat. No. 8,977,571, Defendant's Claim Construction Brief, United Services Automobile Association v. Wells Fargo Bank, N.A., Civil Action No. 2:18-cv-245, dated Apr. 25, 2019, 36 pgs.
CBM2019-00004 U.S. Pat. No. 8,977,571, Patent Owner's Sur-Reply Brief to Petitioner's Reply Brief to Patent Owner Preliminary Response Pursuant to Authorization Provided in Paper 14, dated Apr. 30, 2019, 7 pgs.
USAA's Reply to Claim Construction Brief, United Services Automobile Association v. Wells Fargo Bank, N.A., Civil Action No. 2:18-cv-245, dated May 2, 2019, 15 pgs.
Plaintiff and Counterclaim Defendant's Answer to Defendant and Counterclaim Plaintiff's Amended Answer, Affirmative Defenses, & Counterclaims, United Services Automobile Association v. Wells Fargo Bank, N.A., Civil Action No. 2:18-cv-366, dated Apr. 26, 2019, 18 pgs.
USAA's Reply Claim Construction Brief, United Services Automobile Association v. Wells Fargo Bank, N.A., Civil Action No. 2:18-cv-245, dated May 2, 2019, 227 pgs.
Parties' P.R. 4-5(D) Joint Claim Construction Chart, United Services Automobile Association v. Wells Fargo Bank, N.A., Civil Action No. 2:18-cv-245, dated May 9, 2019, 25 pgs.
CBM2019-00002 U.S. Pat. No. 9,818,090, Decision Denying Institution of Covered Business Method Patent Review 37 C.F.R. § 42.208, dated Apr. 26, 2019, 5 pgs.
CBM2019-00003 U.S. Pat. No. 9,336,517, Decision Denying Institution of Covered Business Method Patent Review 37 C.F.R. § 42.208, dated Jun. 3, 2019, 28 pgs.
CBM2019-00004 U.S. Pat. No. 8,977,571, Decision Denying Institution of Covered Business Method Patent Review 37 C.F.R. § 42.208, dated May 15, 2019, 33 pgs.
CBM2019-00005 U.S. Pat. No. 8,699,779, Decision Denying Institution of Covered Business Method Patent Review 37 C.F.R. § 42.208, dated Jun. 3, 2019, 27 pgs.
USAA's Opening Claim Construction Brief, filed in Civil Action No. 2:18-CV-366, dated May 17, 2019, 32 pgs.
Defendant's Claim Construction Brief, filed in Civil Action No. 2:18-CV-366, dated May 31, 2019, 111 pgs.
Plaintiff's Notice of Filing Claim Construction Presentation, filed in Civil Action No. 2:18-CV-245, dated May 23, 2019, 106 pgs.
IPR2019-01081 U.S. Pat. No. 9,336,517, Petition for Inter Partes Review of Claims 1, 5-10, 12-14, 17-20 of U.S. Pat. No. 9,336,517, dated Jun. 5, 2019, 78 pgs.
IPR2019-01082 U.S. Pat. No. 8,977,571, Petition for Inter Partes Review of Claims 1-13 of U.S. Pat. No. 8,977,571, dated Jun. 5, 2019, 75 pgs.
IPR2019-01083 U.S. Pat. No. 8,699,779, Petition for Inter Partes Review of Claims 1-18 of U.S. Pat. No. 8,699,779, dated Jun. 5, 2019, 74 pgs.
Plaintiff's Notice of Decisions Denying Institution of Covered Business Method Patent Review, filed in Civil Action No. 2:18-CV-245, dated Jun. 6, 2019, 61 pgs.
Claim Construction Memorandum Opinion and Order, filed in Civil Action No. 2:18-CV-245, dated Jun. 13, 2019, 48 pgs.
Parties' P.R.4-5(D) Joint Claim Construction Chart, filed in Civil Action No. 2:18-CV-245, dated Jun. 14, 2019, 28 pgs.
Defendant's Claim Construction Brief, filed in Civil Action No. 2:18-CV-366, dated May 31, 2019, 28 pgs.
USAA's Reply Claim Construction Brief, filed in Civil Action No. 2:18-CV-366, dated Jun. 7, 2019, 14 pgs.
Wells Fargo's Objections to Magistrate Judge Payne's Claim Construction Memorandum Opinion and Order, filed in Civil Action No. 2:18-CV-245, dated Jun. 27, 2019, 7 pgs.
USAA's Objections to Magistrate Judge Payne's Claim Construction Memorandum Opinion and Order, filed in Civil Action No. 2:18-CV-245, dated Jun. 27, 2019, 6 pgs.
Parties' P.R. 4-5(D) Joint Claim Construction Chart, filed in Civil Action No. 2:18-CV-366, dated Jun. 18, 2019, 27 pgs.
IPR2019-00815, Invalidity Chart, uploaded on Jun. 27, 2019, 94 pgs.
IPR2019-00815, United Services Automobile Association (“USAA”)'s Patent Owner Preliminary Response, dated Jun. 27, 2019, 66 pgs.
IPR2019-00815, Supplemental Invalidity Chart, dated Jun. 27, 2019, 16 pgs.
IPR2019-00815, Declaration of Matthew A. Calman in Support of Patent Owner Preliminary Response, dated Jun. 27, 2019, 25 pgs.
CBM2019-00027, Declaration of Bharat Prasad, dated Jul. 8, 2019, 32 pgs.
CBM2019-00027, Patent Owner Preliminary Response and Exhibits 2001-2042, dated Jul. 8, 2019, 91 pgs.
CBM2019-00028, United Services Automobile Association (“USAA”)'s Patent Owner Preliminary Response, dated Jul. 8, 2019, 73 pgs.
CBM2019-00028, Declaration of Matthew A. Calman in Support of Patent Owner Preliminary Response, dated Jul. 8, 2019, 28 pgs.
CBM2019-00028, Malykhina, Elena “Get Smart”, Copyright 2006 by ProQuest Information and Learning Company, 6 pgs.
CBM2019-00028, Palm Treo 700W Smartphone manual, Copyright 2005 by Palm, Inc., 96 pgs.
CBM2019-00028, C720w User Manual for Windows Mobile Smart Phone, Copyright 2006, 352 pgs.
CBM2019-00028, “Smarter Than Your Average Phone”, Copyright 2006 by Factiva, 4 pgs.
CBM2019-00028, “64 Million Smart Phones Shipped Worldwide in 2006”, Canalys Newsroom, 2006, 3 pgs.
CBM2019-00028, Nokia 9500 Communicator user Guide, Copyright 2006 by Nokia Corporation, 112 pgs.
CBM2019-00028, Robinson, Daniel, “Client Week—Handsets advance at 3GSM”, Copyright 2004 by VNU Business Publications Ltd., 2 pgs.
CBM2019-00028, Burney, Brett “MacBook Pro with Intel processor is fast, innovative”, Copyright 2006 by Plain Dealer Publishing Co., 2 pgs.
CBM2019-00028, 17-inch MacBook Pro User's Guide, Copyright 2006 by Apple Computer, Inc., 144 pgs.
CBM2019-00028, Wong, May “HP unveils new mobile computers”, Copyright 2006 by The Buffalo News, 2 pgs.
CBM2019-00028, Jewell, Mark “Cell Phone Shipments Reach Record 208M”, Copyright 2005 by Associated Press, 1 pg.
CBM2019-00028, Lawler, Ryan “Apple shows Intel-based Macs, surge in revenue”, Copyright 2006 by The Yomiuri Shimbun, 2 pgs.
CBM2019-00028, Aspire 9800 Series User Guide, Copyright 2006 by Acer International, 122 pgs.
CBM2019-00028, Dell XPS M1210 Owner's Manual, Copyright 2006 by Dell Inc., 192 pgs.
CBM2019-00028, Estridge, Bonnie “Is your phone smart enough?: The series that cuts through the technobabble to bring you the best advice on the latest gadgets”, Copyright 2006 by XPRESS—Al Nisr Media, 3 pgs.
CBM2019-00028, “Motorola, Palm collaborate on smart phone”, Copyright 2000 by Crain Communications, Inc., 1 pg.
CBM2019-00028, Nasaw, Daniel “Viruses Pose threat to ‘Smart’ Cellphones—Computer Programs Could Cripple Devices and Shut Down Wireless Networks”, Copyright 2004 by Factiva, 2 pgs.
CBM2019-00028, Seitz, Patrick “Multifunction Trend Shaking Up The Handheld Device industry; Solid Sales Expected in 2004; PDA, handset, camera—one single, small product can fill a variety of roles”, Copyright 2004 Investor's Business Daily, Inc., 3 pgs.
Microsoft Mobile Devices Buyer's Guide, 2002, 4 pgs.
Microsoft Mobile Devices Smartphone, 2003, 2 pgs.
Plaintiff's Notice of Decision Denying Institution of Covered Business Method Patent Review, filed in Civil Action No. 2:18-CV-245, dated May 15, 2019, 36 pgs.
Defendant's Claim Construction Brief, filed in Civil Action No. 2:18-CV-366, dated Jun. 24, 2019, 28 pgs.
CBM2019-00029, United Services Automobile Association (USAA)'s Patent Owner Preliminary Response, dated Jul. 17, 2019, 76 pgs.
CBM2019-00029, Declaration of Matthew A. Calman in Support of Patent Owner Preliminary Response, dated Jul. 17, 2019, 29 pgs.
CBM2019-00029, Defendant's Claim Construction Brief, filed in Civil Action No. 2:18-CV-366, dated May 31, 2019, 28 pgs.
CBM2019-00029, Palenchar, Joseph, “PDA Phone Adds WiFi VoIP, Turn-By-Turn GPS Navigation”, Copyright 2006 by Reed Business Information, 2 pgs.
CBM2019-00029, HP User Guide, Additional Product Information, Copyright 2006 by Hewlett-Packard Development Company, L.P., 204 pgs.
CBM2019-00029, Pocket PC User Manual, Version 1, dated May 2006 by Microsoft, 225 pgs.
CBM2019-00029, “Dynamism.com: Take tomorrow's tech home today with Dynamism.com: Latest gadgets merge next generation technology with high style design”, Copyright 2006 Normans Media Limited, 2 pgs.
IPR2019-00815, Federal Reserve Financial Services Retired: DSTU X9.37-2003, Specifications for Electronic Exchange of Check and Image Data, Copyright 2006 by Accredited Standards Committee X9, Inc., dated Mar. 31, 2003, 157 pgs.
IPR2019-01081, Declaration of Peter Alexander, Ph.D., dated Jun. 5, 2019, 135 pgs.
USAA's Opening Claim Construction Brief, United Services Automobile Association v. Wells Fargo Bank, N.A., Civil Action No. 2:18-cv-245, dated Apr. 11, 2019, 32 pgs.
P.R. 4-3 Joint Claim Construction and Pre-Hearing Statement, United Services Automobile Association v. Wells Fargo Bank, N.A., Civil Action No. 2:18-cv-366, dated Apr. 5, 2019, 190 pgs.
Defendant Wells Fargo Bank, N.A.'s Amended Answer, Affirmative Defenses, and Counterclaims to Plaintiff's Complaint, United Services Automobile Association v. Wells Fargo Bank, N.A., Civil Action No. 2:18-cv-366, dated Apr. 12, 2019, 32 pgs.
Plaintiff and Counterclaim Defendant's Answer to Defendant and Counterclaim Plaintiff's Amended Answer, Affirmative Defenses, & Counterclaims, United Services Automobile Association v. Wells Fargo Bank, N.A., Civil Action No. 2:18-cv-245, dated Mar. 21, 2019, 36 pgs.
Higgins, Ray et al., “Working With Image Cash Letters (ICLs) X9.37, 180 or 187 files”, All My Papers, 2009, 36 pgs.
X9.100-180, “The New ICL Standard is Published”, All My Papers, 2006, 3 pgs.
X9.37 Specifications | X9Ware LLC, dated 2018, 3 pgs.
“Getting Started with ICLs aka X9.37 Files”, All My Papers, May 2, 2006, 39 pgs.
Federal Reserve Banks Plan Black-and-White Image Standard and Quality Checks, May 2004, 2 pgs.
Caplan, J. et al., Most Influential Gadgets and Gizmos 2002: Sanyo SCP-5300, 2002, 1 pg.
Hill, S., “From J-Phone to Lumina 1020: A complete history of the camera phone”, Digital Trends, 2020, 9 pgs.
Hoffman, J., “Before There Were Smartphones, There Was I-Mode”, 1999, 5 pgs.
“Vodafone calls on mobiles to go live!”, 2002, 8 pgs.
“Sprint PCS Vision Guide”, 2005, 86 pgs.
FDIC—Remote Capture: A Primer, 2009, 3 pgs.
Callaham, J., “The first camera phone was sold 20 years ago, and it's not what you expect”, Android Authority, 2019, 5 pgs.
Fujisawa, H. et al., “Information Capturing Camera and Developmental Issues”, IEEE Xplore, downloaded on Aug. 18, 2020, 4 pgs.
Rohs, M. et al., “A Conceptual Framework for Camera Phone-based Interaction Techniques”, in Pervasive Computing, Berlin Heidelberg, 2005, pp. 171-189.
Koga, M. et al., Camera-based Kanji OCR for Mobile-phones: Practical Issues, IEEE, 2005, 5 pgs.
Parikh, T., “Using Mobile Phones for Secure, Distributed Document Processing in the Developing World”, IEEE Pervasive Computing, vol. 4, No. 2, 2005, 9 pgs.
Parikh, T., “Mobile Phones and Paper Documents: Evaluating a New Approach for Capturing Microfinance Data in Rural India”, CHI 2006 Proceedings, 2006, 10 pgs.
Magid, L., “A baby girl and the camera phone were born 20 years ago”, Mercury News, 2017, 3 pgs.
Liang, J. et al., “Camera-based analysis of text and documents: a survey”, IJDAR, vol. 7, 2005, pp. 84-104, 21 pgs.
Gutierrez, L., “Innovation: From Campus to Startup”, Business Watch, 2008, 2 pgs.
Doermann, D. et al., “The function of documents”, Image and Vision Computing, vol. 16, 1998, pp. 799-814.
Mirmehdi, M. et al., “Towards Optimal Zoom for Automatic Target Recognition”, in Proceedings of the Scandinavian Conference on Image Analysis, 1:447-454, 1997, 7 pgs.
Mirmehdi, M. et al., “Extracting Low Resolution Text with an Active Camera for OCR”, in Proceedings of the IX Spanish Symposium On Pattern Recognition and Image Processing (pp. 43-48), 2001, 6 pgs.
Zandifar, A. et al., “A Video Based Interface To Textual Information For The Visually Impaired”, IEEE 17th International Symposium On Personal, Indoor and Mobile Radio Communications, 1-5, 2002, 6 pgs.
Laine, M. et al., “A Standalone OCR System For Mobile Cameraphones”, IEEE, 2006, 5 pgs.
Federal Reserve Banks to Adopt DSTU X9.37-2003 Format for Check 21 Image Services, 2004, 2 pgs.
Dhandra, B.V. et al., “Skew Detection in Binary Image Documents Based on Image Dilation and Region Labeling Approach”, IEEE, The 18th International Conference on Pattern Recognition (ICPR'06), 2006, 4 pgs.
PNC Bank to Offer Ease of Online Deposit Service Integrated with QuickBooks to Small Business, RemoteDepositCapture.com, Jul. 24, 2006, 2 pgs.
Andrew S. Tanenbaum, Modern Operating Systems, Second Edition (2001).
Arnold et al, The Java Programming Language, Fourth Edition (2005).
Consumer Assistance & Information—Check 21 https://www.fdic.gov/consumers/assistance/protection/check21.html (FDIC).
Halonen et al., GSM, GPRS, and EDGE Performance: Evolution Towards 3G/UMTS, Second Edition (2003).
Heron, Advanced Encryption Standard (AES), 12 Network Security 8 (2009).
Immich et al., Performance Analysis of Five Interprocess Communication Mechanisms Across UNIX Operating Systems, 68 J. Sys. & Software 27 (2003).
Leach, et al., A Universally Unique Identifier (UUID) URN Namespace, (Jul. 2005) retrieved from https://www.ietf.org/rfc/rfc4122.txt.
N. Ritter & M. Ruth, The GeoTIFF Data Interchange Standard for Raster Geographic Images, 18 Int. J. Remote Sensing 1637 (1997).
Pbmplus—image file format conversion package, retrieved from https://web.archive.org/web/20040202224728/https:/www.acme.com/software/pbmplus/.
Petition filed by PNC Bank N.A. for Inter Partes Review of Claims 1-23 of U.S. Pat. No. 10,482,432, dated Jul. 14, 2021, IPR2021-01071, 106 pgs.
Petition filed by PNC Bank N.A. for Inter Partes Review of Claims 1-7, 10-21 and 23 of U.S. Pat. No. 10,482,432, dated Jul. 14, 2021, IPR2021-01074.
Petition filed by PNC Bank N.A. for Inter Partes Review of Claims 1-18 of U.S. Pat. No. 10,621,559, dated Jul. 21, 2021, IPR2021-01076, 111 pgs.
Petition filed by PNC Bank N.A. for Inter Partes Review of Claims 1-18 of U.S. Pat. No. 10,621,559, filed Jul. 21, 2021, IPR2021-01077, 100 pgs.
Petition filed by PNC Bank N.A. for Inter Partes Review of Claims 1-30 of U.S. Pat. No. 10,013,681, filed Aug. 27, 2021, IPR2021-01381, 127 pgs.
Petition filed by PNC Bank N.A. for Inter Partes Review of U.S. Pat. No. 10,013,605, filed Aug. 27, 2021, IPR2021-01399, 113 pgs.
Readdle, Why Scanner Pro is Way Better Than Your Camera? (Jun. 27, 2016) retrieved from https://readdle.com/blog/why-scanner-pro-is-way-better-than-your-camera.
Santomero, The Evolution of Payments in the U.S.: Paper vs. Electronic (2005) retrieved from https://web.archive.org/web/20051210185509/https://www.philadelphiafed.org/publicaffairs/speeches/2005_santomero9.html.
Schindler, Scanner Pro Review (Dec. 27, 2016) retrieved from https://www.pcmag.com/reviews/scanner-pro.
Sing Li & Jonathan Knudsen, Beginning J2ME: From Novice to Professional, Third Edition (2005), ISBN (pbk): 1-59059-479-7, 468 pages.
Wang, Ching-Lin et al., “Chinese document image retrieval system based on proportion of black pixel area in a character image”, The 6th International Conference on Advanced Communication Technology, vol. 1, IEEE, 2004.
Zaw, Kyi Pyar and Zin Mar Kyu, “Character Extraction and Recognition for Myanmar Script Signboard Images using Block based Pixel Count and Chain Codes”, 2018 IEEE/ACIS 17th International Conference on Computer and Information Science (ICIS), IEEE, 2018.
Jung et al, “Rectangle Detection based on a Windowed Hough Transform”, IEEE Xplore, 2004, 8 pgs.
Craig Vaream, “Image Deposit Solutions: Emerging Solutions for More Efficient Check Processing”, Nov. 2005, 16 pgs.
Certificate of Accuracy related to Article entitled, “Deposit checks by mobile” on webpage: https://www.elmundo.es/navegante/2005/07/21/empresas/1121957427.html signed by Christian Paul Scrogum (translator) on Sep. 9, 2021.
Fletcher, Lloyd A., and Rangachar Kasturi, “A robust algorithm for text string separation from mixed text/graphics images”, IEEE transactions on pattern analysis and machine intelligence 10.6 (1988), 910-918 (1988).
IPR2022-00076, filed Nov. 17, 2021 on behalf of PNC Bank N.A., 98 pgs.
IPR2022-00075, filed Nov. 5, 2021 on behalf of PNC Bank N.A., 90 pgs.
IPR2022-00050, filed Oct. 22, 2021 on behalf of PNC Bank N.A., 126 pgs.
IPR2022-00049, filed Oct. 22, 2021 on behalf of PNC Bank N.A., 70 pgs.
About Network Servers, GlobalSpec (retrieved from https://web.archive.org/web/20051019130842/http://globalspec.com:80/LearnMore/Networking_Communication_Equipment/Networking_Equipment/Network_Servers) (“GlobalSpec”).
FDIC: Check Clearing for the 21st Century Act (Check 21), Fed. Deposit Ins. Corp., Apr. 25, 2016 (retrieved from https://web.archive.org/web/20161005124304/https://www.fdic.gov/consumers/assistance/protection/check21.html) (“FDIC”).
Continuations (4)
Parent 16368202, filed Mar. 2019 (US); Child 16707925 (US)
Parent 15884990, filed Jan. 2018 (US); Child 16368202 (US)
Parent 15014918, filed Feb. 2016 (US); Child 15884990 (US)
Parent 14056565, filed Oct. 2013 (US); Child 15014918 (US)