Checks typically provide a safe and convenient method for an individual such as a payor to transfer funds to a payee. To use a check, the individual usually opens a checking account, or other similar account, at a financial institution and deposits funds, which are then available for later withdrawal. To transfer funds with a check, the payor usually designates a payee and an amount payable on the check. In addition, the payor often signs the check. Once the check has been signed, it is usually deemed negotiable, meaning the check may be validly transferred to the payee upon delivery. By signing and transferring the check to the payee, the payor authorizes funds to be withdrawn from the payor's account on behalf of the payee.
While a check may provide a payor with a convenient and secure form of payment, receiving a check may put certain burdens on the payee, such as the time and effort required to deposit the check. For example, depositing a check typically involves going to a local bank branch and physically presenting the check to a bank teller. To reduce such burdens for the payee, systems and methods have been developed to enable the remote deposit of checks.
For example, the payee may capture a digital image of a check using a mobile device. The financial institution may then receive from the payee the digital image of the check. The financial institution may then use the digital image to credit funds to the payee. However, such a technique requires the efficient and accurate detection and extraction of the information pertaining to a check in the digital image. Capturing a digital image at a mobile device that allows for detection and extraction of the information from the digital image may be difficult due to focus, resolution, glare and other typical challenges in optical image capture.
In an implementation, a system is provided for depositing a document, such as a negotiable instrument or any other type of document related to a commercial or non-commercial transaction. The system includes a mobile device having a camera and a processor, wherein the processor is configured to capture a plurality of images of the same side of a document with the camera, analyze the plurality of images to identify a plurality of acceptable portions within the plurality of images, wherein the acceptable portions of each image satisfy predetermined image quality criteria, combine the plurality of acceptable portions to generate a composite image of the document, and transmit the composite image of the document from the mobile device to a depository via a communication pathway between the mobile device and the depository.
In another implementation, a system is provided for depositing a document, comprising a mobile device having a camera and a processor, wherein the processor is configured to capture an image of the document with the camera, analyze the image of the document to identify a designated portion of the document to recapture, wherein the designated portion fails to satisfy predetermined image quality criteria, recapture an image of the identified designated portion of the document, replace the designated portion of the image with the recaptured image to generate a composite image of the document, and transmit the composite image of the document from the mobile device to a depository via a communication pathway between the mobile device and the depository.
In another implementation, a system is provided for depositing a document, comprising a server having a processor and memory, wherein the processor is configured to receive a plurality of images of the same side of a document from a mobile device, analyze the plurality of images to identify a plurality of acceptable portions within the plurality of images, wherein the acceptable portions of each image satisfy predetermined image quality criteria, combine the plurality of acceptable portions together to generate a composite image of the document and transmit the composite image of the document from the server to a depository via a communication pathway between the server and the depository.
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
The foregoing summary, as well as the following detailed description of illustrative embodiments, is better understood when read in conjunction with the appended drawings. For the purpose of illustrating the embodiments, there are shown in the drawings example constructions of the embodiments; however, the embodiments are not limited to the specific methods and instrumentalities disclosed. In the drawings:
In the following detailed description of example embodiments, reference is made to the accompanying drawings, which form a part hereof and in which is shown, by way of illustration, specific embodiments in which the example methods, apparatuses, and systems may be practiced. It is to be understood that other embodiments may be used and structural changes may be made without departing from the scope of this description.
As described further below, a system and method are disclosed for enhancing image capture for remote deposit capture of documents, such as negotiable instruments or any other type of documents related to a commercial or non-commercial transaction. With respect to the description herein, a document may be or include an agreement, a contract, commercial paper, a document of title, money, a negotiable instrument, a security interest, a transaction document, or any logical combination thereof. In an example, a document may be or include a check, a money order, a unit of money, a verification card (such as a social security card, a driver's license, or a student identification), or any document with an account number (such as a bill or receipt of payment). The document could also be a loan application (such as a mortgage application), for example.
In one implementation, a mobile device captures multiple pictures of the same side of a check. The pictures are analyzed to identify portions of the pictures that are suitable for processing. The acceptable portions of two or more pictures are combined to generate a composite image of the check for processing.
In another implementation, a mobile device captures a picture of a check. The picture is analyzed to identify a portion of the picture that is not suitable for processing and that will be recaptured at a higher resolution. The recaptured portion is combined with the original picture to generate a composite image of the check for processing.
In an example, a document can include a type of contract that obligates one party to pay a specified sum of money to another party. In an example, a document can be an unconditional writing that promises or orders payment of a fixed amount of money. In an example, a document is a check. In such an example, the check may be taken by the receiving party and deposited into an account at a financial institution of the receiving party. The receiving party may endorse the check and then present it for deposit at a bank branch, via an automated teller machine (ATM), or by using remote deposit. Some other examples of documents may include money orders, cashier's checks, drafts, bills of exchange, promissory notes, and the like. A money order is a trusted financial instrument that is a payment order for a pre-specified amount of money. A cashier's check (also known as a bank check, official check, teller's check, bank draft or treasurer's check) is a check guaranteed by a bank and may be purchased from a bank.
The user 102 may be an individual or entity who owns account 160 that may be held at financial institution 130. Account 160 may be any type of deposit account for depositing funds, such as a savings account, a checking account, a brokerage account, and the like. The user 102 may deposit a check 108 or another type of document (such as another type of negotiable instrument) in the account 160 either electronically or physically. In such an example, the financial institution 130 may process and/or clear the check 108 or other type of document. The user 102 may communicate with financial institution 130 by way of communications network 120 such as an intranet, the Internet, a local area network (LAN), a wide area network (WAN), a wireless fidelity (WiFi) network, a public switched telephone network (PSTN), a cellular network, a voice over Internet protocol (VoIP) network, and the like. The user 102 may communicate with financial institution 130 by phone, email, instant messaging, text messaging, web chat, facsimile, mail, and the like. Financial institutions 130, 140, and 150 also may communicate with each other by way of communications network 120.
In an implementation, the user 102 may receive payment from another individual such as a payor in the form of a check 108 or other type of document that is drawn from account 170 at financial institution 150. The user 102 may endorse the check 108 (e.g., sign the back of the check 108) and indicate an account number on the check 108 for depositing the funds. It is noted that although examples described herein may refer to a check, the techniques and systems described herein are contemplated for, and may be used for, deposit of any document. Similarly, the techniques and systems described herein are contemplated for and may be used with any form or document whose image may be captured with a camera or other imaging device of a mobile device for subsequent storage and/or processing.
As described further herein, a digital image of a check or other document may be provided from a user to a financial institution, and the digital image may be processed and funds associated with the check or document in the digital image may be deposited in a user's bank account. The user 102 may deposit the check 108 into account 160 by making a digital image of the check 108 and sending the image file containing the digital image to financial institution 130. For example, after endorsing the check 108, the user 102 may use a mobile device 106 that comprises a camera to convert the check 108 into a digital image by taking a picture of the front and/or back of the check 108. The mobile device 106 may be a mobile phone (also known as a wireless phone or a cellular phone), a personal digital assistant (PDA), or any handheld computing device, for example. Aspects of an example mobile device are described with respect to
To increase the likelihood of capturing a digital image of the check 108 that may be readable and processed such that the check 108 can be cleared, the image is monitored for compliance with one or more monitoring criteria, prior to the image of the check 108 being captured. The monitoring criteria may be directed to proper lighting and/or framing of the check 108 in an image of the check 108 that will be captured and presented for clearing of the check 108. An application may monitor whether the check 108 is sufficiently within the frame of the camera and has a high enough quality for subsequent processing. The monitoring is performed with respect to the image as it appears in the field of view of the camera of the mobile device 106. The field of view is that part of the world that is visible through the camera at a particular position and orientation in space; objects outside the field of view when the image is captured are not recorded in the image. The monitoring criteria may be based on one or more of light contrast on the image, light brightness of the image, positioning of the image, dimensions, tolerances, character spacing, skewing, warping, corner detection, and MICR (magnetic ink character recognition) line detection, as described further herein. The monitoring may be performed by the camera, the mobile device 106, and/or a financial institution that is in communication with the mobile device 106. Feedback may be provided to the user 102 regarding the image of the check in the field of view. Based on the feedback, the user 102 may reposition the check 108 and/or the camera, for example, or may capture a plurality of images of the check 108. For examples of monitoring criteria and feedback provided to the user, U.S. Pat. No. 8,699,779 and U.S. patent application Ser. Nos. 14/224,944, 14/516,335, 14/516,350, 14/516,364, 13/922,686 and 12/545,127 are hereby incorporated by reference.
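By way of a non-limiting illustration, the following sketch shows how a handful of such monitoring criteria (brightness, contrast, framing, and skew) might be evaluated on a single preview frame. It assumes the OpenCV library, and the function name `check_monitoring_criteria` and all threshold values are illustrative assumptions rather than values required by this description.

```python
# Illustrative sketch only: evaluating a few monitoring criteria on a
# camera preview frame. Assumes OpenCV (cv2); all thresholds are example
# values, not values prescribed by this description.
import cv2


def check_monitoring_criteria(frame_bgr):
    """Return pass/fail results for brightness, contrast, framing, and skew."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)

    brightness_ok = 60 <= gray.mean() <= 200      # usable light brightness
    contrast_ok = gray.std() >= 30                # rough light-contrast measure

    # Framing and skew: locate the largest contour (assumed to be the check)
    # and check how much of the frame it fills and how far it is rotated.
    edges = cv2.Canny(gray, 50, 150)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    framing_ok = skew_ok = False
    if contours:
        largest = max(contours, key=cv2.contourArea)
        framing_ok = cv2.contourArea(largest) / float(gray.size) >= 0.4
        angle = cv2.minAreaRect(largest)[2]
        skew_ok = min(abs(angle), abs(90 - abs(angle))) <= 10  # within ~10 degrees

    return {"brightness": brightness_ok, "contrast": contrast_ok,
            "framing": framing_ok, "skew": skew_ok}
```

Feedback to the user (e.g., "move closer" or "reduce tilt") could then be keyed off whichever criteria fail before an image is captured.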
In an implementation, when the image of the check 108 in the field of view passes the monitoring criteria, a plurality of images may be automatically taken by the camera of the same side of check 108. A plurality of images of the same side of check 108 is captured to provide a greater probability that the captured images will be sufficient to process an image of check 108 for deposit. By capturing a plurality of images of the same side of check 108, user frustration may be reduced and check processing and deposit may be streamlined by reducing the necessity for the user to recapture additional images of the same side of check 108. The plurality of images may be analyzed to identify acceptable portions within the plurality of captured images. The acceptable portions within the plurality of captured images may be identified based on satisfying predetermined image quality criteria, may be the best quality image portions, may be the first portions found to satisfy a minimum quality threshold, or other like qualifications. Further, the acceptable portions may be identified based on a combination of the aforementioned qualifications. The acceptable portions within the captured images may be combined or stitched together by mobile device 106 to generate a composite, composition or composed image (hereinafter, collectively referred to as "composite image"). The composite image may be provided from the mobile device 106 to a financial institution. By generating a composite image from the plurality of captured images, the number of non-conforming images of checks is reduced during presentment of the images to a financial institution for processing and clearing.
In an implementation, the image capture, analysis and/or combining may be performed automatically by the camera, the mobile device 106, and/or a financial institution as soon as the image of the check 108 is determined to pass the monitoring criteria or after the images are captured. Alternatively, the user 102 may manually instruct the camera to perform the image capture (e.g., by pressing a button on the camera or the mobile device 106) after the user 102 receives an indication or other feedback that the image passes the monitoring criteria, the image analysis (e.g., by interfacing with the camera or the mobile device 106) after the images are captured, and/or the image combining (e.g., by interfacing with the camera or the mobile device 106) after the images are analyzed.
In an implementation, the composite image is further processed by the camera, the mobile device 106, and/or the financial institution to improve clarity of a blurry or otherwise unsatisfactory image.
In an example, a document (such as the check 108) can be scanned by the camera, so that the scan of the document can provide a height and width of the document (or at least a selected part of the document). These measurements can then be used by the camera, the mobile device 106, and/or a financial institution to verify a document type (such as whether the document type is a check, a driver's license, a unit of money, or a deposit notification). The verification can be used as a basis for the camera, the mobile device 106, and/or a financial institution to automate an electrical or optical analysis on the document. In examples where the document type is identified, the camera, the mobile device 106, and/or a financial institution can analyze the document according to standardized and/or common features of the document type. For example, after identifying the document type, the camera, the mobile device 106, and/or a financial institution can be directed to certain relevant information of the document according to standardized or common locations of such information on such a document.
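As a non-limiting illustration, the sketch below verifies a document type from a measured width and height by comparing aspect ratios. The reference dimensions, the tolerance, and the helper name `verify_document_type` are illustrative assumptions, not values prescribed by this description.

```python
# Illustrative sketch only: verify a document type from measured width and
# height (e.g., in inches). Reference dimensions are example values.
KNOWN_TYPES = {
    "personal_check": (6.0, 2.75),
    "business_check": (8.5, 3.5),
    "drivers_license": (3.375, 2.125),
}


def verify_document_type(width_in, height_in, tolerance=0.15):
    """Return the best-matching document type, or None if nothing matches."""
    measured_ratio = width_in / height_in
    best_type, best_err = None, tolerance
    for doc_type, (w, h) in KNOWN_TYPES.items():
        err = abs(measured_ratio - w / h) / (w / h)
        if err < best_err:
            best_type, best_err = doc_type, err
    return best_type
```

Once a type is matched in this fashion, the analysis can be directed to the standardized locations of relevant fields for that document type.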
In an implementation, the plurality of images may be sent to the financial institution 130 using the mobile device 106. Any technique for sending a digital image to financial institution 130 may be used, such as providing a digital image to a website associated with financial institution 130 from storage, emailing a digital image to financial institution 130, or sending a digital image in a text message or instant message, for example. The financial institution 130 may generate a composite image from the plurality of images received from the user 102 via the mobile device 106.
In another implementation, when the image of the check 108 in the field of view passes the monitoring criteria, one or more images of the check 108 are automatically captured by the camera. An image may be analyzed to identify a designated portion of the check 108 within the captured image to be recaptured by the camera. The designated portion within the captured image of check 108 may be identified based on failing to satisfy predetermined image quality criteria, may fail to satisfy a minimum quality threshold, may be unintelligible, or may fail to satisfy other like qualifications. Further, the designated portion may be identified based on a combination of the aforementioned qualifications. User 102 may be instructed to recapture the designated portion of check 108, preferably at a higher resolution. The designated portion of check 108 may be recaptured to provide a greater probability that the captured images will be sufficient to process an image of check 108 for deposit. By recapturing the designated portion of check 108, user frustration may be reduced and check processing and deposit may be streamlined by reducing the necessity for the user to recapture additional images of the entire check 108. The recaptured image may be combined with the captured image by mobile device 106 to generate a composite image of the check 108. The composite image may be provided from the mobile device 106 to a financial institution.
In an implementation, the designated portion of the check 108 may be identified for recapture by displaying a grid to user 102 indicating the designated portion of check 108 to be recaptured. Alternatively, the designated portion of the check 108 may be identified for recapture by displaying an alignment guide to user 102.
Financial institution 130 may receive one or more digital images representing the check 108 and may use any known image processing software or other application(s) to obtain the relevant data of the check 108 from the digital images. Financial institution 130 may determine whether the financial information associated therewith may be valid. For example, financial institution 130 may include any combination of systems and subsystems such as electronic devices including, but not limited to, computers, servers, databases, or the like. The electronic devices may include any combination of hardware components such as processors, databases, storage drives, registers, cache, random access memory (RAM) chips, data buses, or the like and/or software components such as operating systems, database management applications, or the like. According to an embodiment, the electronic devices may include a network-based server that may process the financial information and may receive the digital images from the user 102.
The electronic devices may receive the digital images and may perform an analysis on the quality of the digital image, the readability of the data contained therein, or the like. For example, the electronic devices may determine whether the account number, amount payable, and the like may be readable such that it may be parsed or otherwise obtained and processed by the financial institution to credit an account 160 associated with the user 102 and debit an account associated with the payor. In an implementation, a representative 135 of financial institution 130 may provide assistance to the user 102 and may provide assistance in determining whether the financial information may be readable and/or of a good enough quality to be processed.
Upon receipt and approval of a digital image, financial institution 130 may credit the funds to account 160. Financial institution 130 may clear the check 108 by presenting a digital image of the check 108 captured from the digital image to an intermediary bank, such as a regional branch of the Federal Reserve, a correspondent bank, and/or a clearinghouse bank. For example, the check 108 may be cleared by presenting the digital image to financial institution 140, which may be a regional branch of the Federal Reserve, along with a request for payment. Financial institutions 130 and 150 may have accounts at the regional branch of the Federal Reserve. Financial institution 130 may create a substitute check using the image provided by the user 102 and present the substitute check to financial institution 140 for further processing. Upon receiving the substitute check, financial institution 140 may identify financial institution 150 as the paying bank (e.g., the bank from which the check 108 is drawn). This may be accomplished using a nine digit routing number located on the bottom left hand corner of the check. A unique routing number is typically assigned to every financial institution in the United States. Financial institution 140 may present the substitute check to financial institution 150 and request that the check be paid. If financial institution 150 verifies the check (i.e., agrees to honor the check), financial institution 140 may then settle the check by debiting funds from financial institution 150 and crediting funds to financial institution 130. Financial institution 150 may then debit funds from account 170.
It will be appreciated that the preceding examples are for purposes of illustration and explanation only, and that an embodiment is not limited to such examples. For example, financial institution 150 may be a correspondent bank (i.e., engaged in a partnership with financial institution 130). Thus, financial institution 130 may bypass the regional branch of the Federal Reserve and clear the check directly with financial institution 150. In addition, account 160 and account 170 may both be held at financial institution 130, in which case the check 108 may be cleared internally.
In an implementation, the mobile device 106 may comprise a video source such as a video camera, a web camera, or a video-enabled phone, for example, to obtain a video of the check 108. A frame of the video may be obtained and monitored with respect to monitoring criteria, as described further herein. The mobile device 106 and/or the institution may obtain the frame and monitor the frame, depending on an implementation. Generation of a live video of a check 108 is not limited to a video camera, a web camera, and a video-enabled phone, and it is contemplated that any device that is capable of generating a live video may be used to make a video of the check 108 which may be monitored in real-time with respect to monitoring criteria. Additional devices that may be used in the generation and/or transmission of a live video include a web-enabled video computing device, a mobile phone, a camcorder, and a computer camera, for example.
In an implementation, the mobile device 106 may comprise a camera 207, such as a digital camera. Such a mobile device may be called a smart phone or camera phone.
In an implementation, prior to camera 207 capturing an image in its field of view, the image may be monitored with respect to monitoring criteria, e.g., using a software application running on the mobile device 106. Feedback based on the monitoring of the image may be provided to the user 102 to assist the user 102 in positioning the check 108 so that the image of the check 108 may be captured in such a manner that it may be more easily processed and cleared during subsequent operations, such as those involving one or more financial institutions.
A depository 204 may include a bank in which the user 102 has a deposit account; however, the present disclosure is not limited to just banks. Alternatively, a third party may act as the depository 204 providing functionality to a plurality of users without regard to the bank at which they have deposit accounts, or whether their individual bank allows for the methods and systems described herein. In an implementation, the depository 204, after receiving the image(s) of the check 108 from the user 102, may use a clearinghouse 210 to perform the check clearing operations. As described with respect to the system 100 of
In an implementation, the user 102 may place the check 108 on a background and generate one or more digital images comprising an image of the check (e.g., a check image) and a portion of the background (e.g., a background image) using the camera 207. Any background may be used, although a dark background or a consistently colored background may provide more optimal results. It is noted that although examples and implementations described herein may refer to a check image and check data, the term “check image” may refer to any foreground image in a digital image (as opposed to the background image) and the term “check data” may refer to any foreground data in a digital image (as opposed to background data). Thus, the “check image” and the “check data” may refer to the foreground image and foreground data in implementations involving any type of document.
In an implementation, the image being monitored in the field of view of the camera 207 comprises check data and background data. The check data pertains to the check image and the background data pertains to the background image (e.g., the background on which the check image is disposed).
The user of the mobile device 106 may introduce distortions in the image via camera 207 due to a perspective problem, specifically an angling of the camera vertically over the check such that the top of the check appears smaller than the bottom, or the reverse. Skewing occurs when the check 208 is rotated from the horizontal in the image 230. Warping, as used herein, means that the check 108 is tilted forward or back with respect to a plane that is perpendicular to a line drawn from the camera lens to the center of the check 108. Warping, or tilting, of the image may lead to incorrect optical detection of the check 108.
The operator of the camera 207 may also introduce image problems due to the light in the image 230, such as the light contrast and/or light brightness found on the image 230, such as in various regions of the image 230. For example, the light contrast between the check image 247 and the background image 250 may be insufficient to render an acceptable image. As another example, the light brightness on various regions of the image may be inconsistent compared to each other and may prevent the entirety of the image from being properly processed. In yet another example, if the difference between the light brightness of the various regions is too low (e.g., the light brightness does not vary significantly among the regions), it may not be possible to process the image.
In one implementation, when the image of the check 108 in the field of view passes the monitoring criteria, a plurality of images may be captured by the camera 207. The mobile device 106 may capture the plurality of images from different perspectives, such as close range, long range and by user 102 manually tilting the camera in various directions. Mobile device 106 may also capture the plurality of images utilizing different settings of camera 207 while capturing each image, such as zoom, flash, anti-shake, aperture, f-stop, exposure, shutter speed, balance, effects, contrast, etc. The user 102 may be instructed to initiate the image capture via feedback indicator 235, by audible feedback, haptic feedback, etc. Alternatively, mobile device 106 may automatically capture the plurality of images without further intervention by user 102. The plurality of images may be taken individually, in a series (e.g., with a specified time delay or a specified sequence of actions between image captures), or in a burst.
Various portions of the captured images of check 108 may not be captured in a manner that can be processed for deposit. For example, certain areas of the captured images may be blurry, out of focus, overexposed, washed out, of incorrect dimensions, of insufficient contrast, taken in insufficient light, etc. However, other portions of the captured images may have been sufficiently captured for processing and are acceptable. In an implementation, the plurality of captured images may be analyzed to identify the acceptable portions within the plurality of captured images.
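One illustrative way to identify acceptable portions is to divide each captured image into a grid of tiles and score each tile against simple quality metrics. The sketch below uses the variance of the Laplacian as a sharpness measure and the mean intensity as an exposure check; it assumes OpenCV, and the grid size, thresholds, and helper name `acceptable_tiles` are illustrative assumptions rather than required criteria.

```python
# Illustrative sketch only: score grid tiles of a captured image against
# simple quality criteria. Assumes OpenCV; grid size and thresholds are
# example values.
import cv2


def acceptable_tiles(image_bgr, rows=3, cols=6,
                     min_sharpness=100.0, min_mean=40, max_mean=220):
    """Map each (row, col) tile to True if it satisfies the example criteria."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    h, w = gray.shape
    results = {}
    for r in range(rows):
        for c in range(cols):
            tile = gray[r * h // rows:(r + 1) * h // rows,
                        c * w // cols:(c + 1) * w // cols]
            sharpness = cv2.Laplacian(tile, cv2.CV_64F).var()   # blur metric
            exposure_ok = min_mean <= tile.mean() <= max_mean   # not washed out or dark
            results[(r, c)] = sharpness >= min_sharpness and exposure_ok
    return results
```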
In an implementation, the acceptable portions within the plurality of images may be combined or stitched together by mobile device 106 to generate a composite image of the document. The composite image may resemble the document as a whole, such as check image 247. Alternatively, the composite image may reflect only the portions of the document relevant to processing, such as the MICR line, routing number, account number, check number, amount, payor name, payee name, signature line, endorsement signature, endorsement account number, etc. Generating a composite image as described herein may eliminate the necessity to recapture additional images of the document for processing.
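Continuing the illustration above, and assuming the plurality of captures have already been registered to a common geometry (an assumption, since alignment is not detailed here), the acceptable tiles may be combined by taking, for each grid cell, the sharpest tile available among the captures:

```python
# Illustrative sketch only: combine the sharpest available tile from each
# grid cell across several registered captures of the same side of a check.
import cv2
import numpy as np


def composite_from_captures(captures, rows=3, cols=6):
    """captures: list of equally sized, aligned BGR images of the document."""
    h, w = captures[0].shape[:2]
    composite = np.zeros_like(captures[0])
    for r in range(rows):
        for c in range(cols):
            ys = slice(r * h // rows, (r + 1) * h // rows)
            xs = slice(c * w // cols, (c + 1) * w // cols)
            # Pick the capture whose tile has the highest Laplacian variance.
            best = max(captures, key=lambda img: cv2.Laplacian(
                cv2.cvtColor(img[ys, xs], cv2.COLOR_BGR2GRAY), cv2.CV_64F).var())
            composite[ys, xs] = best[ys, xs]
    return composite
```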
For example, referring to
In an implementation, the composite image may be further processed by the camera 207, the mobile device 106, and/or the financial institution to improve clarity of a blurry or otherwise unsatisfactory image. Any technique for processing a digital image may be used, such as post-processing image compositing. Post-processing may also include optical character recognition (OCR), or may use any known image processing software or other application(s) to obtain the relevant data of the check 108 from the composite image.
In an example, post-processing can include manipulating the image such that the document is compliant with one or more certain guidelines or standards. Such a manipulation may include enhancing contrast of an image of the document so that features of the document are more visible. The manipulation may also include removing noise, so that features of the document are more identifiable than prior to the manipulation. The manipulation may also include cropping the image to remove a background portion from the image. OCR algorithms for reading features of a document can be used to enhance these example operations of the manipulation in post-processing.
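A non-limiting sketch of such post-processing (removing noise, enhancing contrast, and cropping to the document boundary) follows. It assumes OpenCV, and the specific operations and parameter values are illustrative assumptions rather than required steps.

```python
# Illustrative sketch only: denoise, enhance contrast, and crop a composite
# image to the document boundary. Assumes OpenCV; parameters are examples.
import cv2


def post_process(composite_bgr):
    gray = cv2.cvtColor(composite_bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.fastNlMeansDenoising(gray, h=10)     # remove noise
    gray = cv2.equalizeHist(gray)                   # enhance contrast
    # Crop to the largest contour, treated here as the document boundary.
    _, thresh = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(thresh, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        x, y, w, h = cv2.boundingRect(max(contours, key=cv2.contourArea))
        gray = gray[y:y + h, x:x + w]
    return gray
```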
In an implementation, when the image of the check 108 in the field of view passes the monitoring criteria, one or more images of the check 108 may be captured by the camera 207. Various portions of a captured image of check 108 may not have been captured in a manner that can be processed for deposit. For example, a portion of the captured image may be blurry, out of focus, overexposed, washed out, of incorrect dimensions, of insufficient contrast, taken in insufficient light, etc. In an implementation, an image of check 108 may be analyzed to identify a designated portion of the check 108 within the captured image that may not have been captured in a manner that can be processed for deposit.
In an implementation, the user 102 may be instructed to recapture the designated portion of the check 108. The designated portion of check 108 may be recaptured utilizing different settings of camera 207, such as zoom, flash, anti-shake, aperture, f-stop, exposure, shutter speed, balance, effects, contrast, etc. Preferably, the designated portion of check 108 may be recaptured at a higher resolution. New monitoring criteria and feedback may be utilized during recapture. In another implementation, the designated portion of the check 108 may be automatically recaptured without further intervention by user 102.
In an implementation, a grid or alignment guide may be overlaid on the camera feed of the mobile device 106. The grid or alignment guide may take any shape such as a bounding rectangle or other bounding box or shape, horizontal and/or vertical bars, parallel lines, etc., for example. In an implementation, the image 230 may be divided into portions by a grid, such as the grid shown in
In an implementation, a bounding rectangle, for example, may be used as the alignment guide; aligning the designated portion of check 108, and thereby passing the new monitoring criterion, means enclosing the check 108 within the bounding rectangle. If the designated portion of check 108 is outside of the alignment guide in the image 230, feedback may be generated and provided to the user 102 regarding this new monitoring criterion with instructions for moving the check 108 or the camera 207 in order to properly align the designated portion of check 108 in the field of view. In an implementation, an indicated portion 280A2 of the grid shown in
In an implementation, the recaptured image of the designated portion of check 108 may be combined or stitched together with the captured image of check 108 to generate a composite image of the document. The composite image may resemble the document as a whole, such as check image 247. Alternatively, the composite image may reflect only the portions of check 108 relevant to processing, such as the MICR line, routing number, account number, check number, amount, payor name, payee name, signature line, endorsement signature, endorsement account number, etc. Generating a composite image as described herein may eliminate the necessity to recapture a series of images of check 108 before processing.
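As a non-limiting illustration, and assuming the designated portion is expressed as a rectangle in the original capture's pixel coordinates and the recaptured image has been registered to that region (both assumptions), the recaptured portion may be combined with the original capture as follows:

```python
# Illustrative sketch only: replace a designated region of an original
# capture with a recaptured (e.g., higher-resolution) image of that region.
# Assumes the region is given in the original image's pixel coordinates and
# that the recapture has already been registered to that region.
import cv2


def patch_designated_portion(original, recaptured, region):
    """region: (x, y, w, h) of the designated portion within `original`."""
    x, y, w, h = region
    patch = cv2.resize(recaptured, (w, h), interpolation=cv2.INTER_AREA)
    composite = original.copy()
    composite[y:y + h, x:x + w] = patch
    return composite
```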
In an implementation, the above embodiments may be combined. For example, in an implementation, when the image of the check 108 in the field of view passes the monitoring criteria, a plurality of images may be captured by the camera 207. The plurality of captured images may be analyzed to identify the acceptable portions within the plurality of captured images. The acceptable portions within the captured images may be combined or stitched together to generate a composite image of the document. The composite image may be analyzed to identify a designated portion of the check 108 within the captured image that may not have been captured in a manner that can be processed for deposit. The designated portion of check 108 may be recaptured by the camera 207. The recaptured image of the designated portion of check 108 may be combined or stitched together with the composite image to generate a new composite image of the document. The new composite image may be further processed to improve clarity of a blurry or otherwise unsatisfactory image. It is contemplated that the elements of a system in above implementations and the steps performed may be performed by any combination of the camera 207, the mobile device 106, and/or the financial institution.
The server 322, in one example, may send instructions 330 to the client 320 that execute an application on the client 320. This may include instructions that cause a software object, which may have been previously downloaded and installed (e.g., pre-installed) on the client 320, to be executed on the client 320. In another implementation, server 322 may transmit a software application, or app, to client 320 for installation. The software application may be made available to client 320 via an online depository, database or application (app) store such as Google Play, iTunes or the like. The software object may analyze the image in the field of view of a digital camera (e.g., the image 230 shown in the field of view of the camera 207 associated with the mobile device 106) with respect to one or more monitoring criteria and may generate and provide feedback to the user regarding the monitoring criteria and/or instructions for capturing one or a plurality of images of the check 108. The software object may automatically capture one or a plurality of images of the check 108 without any further intervention by the user.
In another example, the instructions 330 may include a wholly self-contained application that when delivered to the client 320 will execute and perform one or more operations described herein, such as those directed to analyzing the image in the field of view of the camera 207 with respect to monitoring criteria, providing feedback to the user 102, and capturing one or a plurality of images of the check 108. In either example, the software object may be configured to make one or more software calls 310 to the camera 207. This may be through specific software instructions to the camera 207. In other words, the camera's functionality may not be abstracted through any software library. In such an example, software code may be written and delivered to every different camera-equipped mobile phone.
In an alternate example, the software object may operate through a software abstraction layer, such as an application programming interface (API). The software object developer may only insert code into the software object to call one or more APIs exposed by the software operating the mobile device 106. One example of such software is Windows Mobile by Microsoft Corporation. In the context of a Windows Mobile device, the Windows Mobile operating system (OS) has one or more APIs exposed to application developers that will translate instructions from applications into instructions operable by the camera 207 on the mobile device 106. A mobile operating system, also known as a mobile platform or a handheld operating system, is the operating system that controls a mobile device. Other mobile OSs include Symbian OS, iPhone OS, Palm OS, BlackBerry OS, and Android.
The software object may cause the camera 207 to analyze an image in the field of view with respect to monitoring criteria, provide feedback, and/or take a picture or capture one or a plurality of images of the check 108 being deposited. These images may be captured sequentially, e.g., pursuant to the user 102 flipping the check 108 over after an image of the front of the check 108 has been captured after passing the monitoring criteria. However, each side of the check 108 may be captured by the camera 207 using similar API calls. The images may be stored in an image file(s) 315.
Once the images of one or both sides of the check 108 pass the monitoring criteria and are captured by the camera 207, the image file(s) 315 may be analyzed by the software object of the client 320. The analysis of image file(s) 315 identifies the acceptable portions within the captured images and/or identifies designated portion(s) of the image file(s) 315 to be recaptured. Once the acceptable portions have been identified and/or the designated portion(s) have been recaptured, the portion(s) of the image files 315 are combined or stitched together to generate composite image file(s).
Once the images file(s) have been combined, the composite image file(s) may be operated on by the software object of the client 320. These operations may include any of the following: deskewing, dewarping, magnetic ink character recognition, cropping (either automatically, or having the user 102 manually identify the corners and/or edges of the check 108 for example), reducing the resolution of the image, number detection, character recognition, and the like.
With respect to number and character recognition, commercial check scanners have used characteristics of the MICR encoding to detect information about the check, such as the bank's routing number and the account number. However, the characteristics that these scanners have used are the magnetic characteristic of the ink itself and these scanners have used methods similar to those of magnetic audio tape readers. In an implementation, a software object of the client 320 may optically recognize the characters on the MICR line, as a consumer mobile device such as the mobile device 106 will lack the magnetic reading ability of a commercial check scanner.
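A non-limiting sketch of such optical recognition of the MICR line follows. It assumes the pytesseract wrapper for Tesseract and a traineddata model for the E-13B MICR font installed under the hypothetical name "e13b", and it simply crops the bottom strip of the check image before recognition; the strip fraction and page-segmentation mode are illustrative assumptions.

```python
# Illustrative sketch only: optically read the MICR line from the bottom
# strip of a grayscale check image. Assumes pytesseract/Tesseract with a
# traineddata model for the E-13B MICR font installed under the
# hypothetical name "e13b".
import cv2
import pytesseract


def read_micr_line(check_gray):
    h = check_gray.shape[0]
    micr_strip = check_gray[int(h * 0.80):, :]      # bottom ~20% of the check
    _, binary = cv2.threshold(micr_strip, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # --psm 7 treats the strip as a single line of text.
    return pytesseract.image_to_string(binary, lang="e13b", config="--psm 7").strip()
```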
The image may also be down-converted into a grayscale or black and white image, such as either in Joint Photographic Experts Group (JPEG) compliant format or in tagged image file format (TIFF), for example. In an alternate example, the image may be formatted as a Scalable Vector Graphics (SVG) image. One of the benefits of an SVG file is a large size advantage over JPEG. In the former example, the image at some point before entry into the clearing system may be converted to TIFF format. This may be performed at the mobile device 106, wherein the camera 207 captures the image in TIFF format. However, the camera 207 of the mobile device 106 may capture the image in JPEG format, which may then be converted into TIFF either at the mobile device 106 or at the server 322. In the latter example, this may involve the transmission of the TIFF image across a communications network, which may be more advantageous as TIFF images are typically smaller in file size for the same size of picture as a JPEG formatted image.
The software object on the client 320 may operate by performing one or more of the operations described herein and then transmitting an image file 335 (e.g., based on image file 315 that has been processed) to the server 322 after the user 102 confirms that they do wish to deposit the check 108. Alternately, the software object may capture the images of the check 108 and transmit those images to the server 322, which in turn may perform those operations, verify that the image quality is within acceptable thresholds, and communicate that verification back to the client 320, which can then instruct the user 102 to take pictures of the other side of the check 108. In this example, the images transmitted to the server 322 may be in any format, such as JPEG or TIFF, insofar as the server software has the ability to convert that image into a Check 21 compliant format. Alternately, the bank may output an X9.37 file to the clearing system. The Check Clearing for the 21st Century Act (or Check 21 Act) is a United States federal law that allows the recipient of a paper check to create a digital version, thereby eliminating the need for further handling of the physical document. The Check 21 standard for electronic exchange is defined in the standard DSTU X9.37-2003 ("X9.37"). It is a binary interchange format.
The server 322 may confirm (e.g., using a process confirmation 340) with the user 102 the transmission, reception, and processing of each side of the check 108 separately, or may confirm both sides at the same time. On the server side, more operations may be performed, such as signature verification. Where to perform these operations may be determined by the processing power of the mobile device 106 itself, which is typically limited in computational power. However, the present discussion is not limited in any way by discussion of where certain operations are described as operating. The operations of detecting and verifying information may be performed by the client 320 before the information is transmitted along with the image in the image file 335 to the server 322. Alternately, the software object(s) operating on the mobile device 106 may perform no operation other than capturing images of the front and back of the check 108 after passing the monitoring criteria, receiving confirmation that the user 102 wishes to proceed, and transmitting those images to the server 322, wherein the server 322 performs those operations.
In an implementation, after the image file(s) 335 have been received by the server 322, the server 322 may send a process confirmation 340 to the client 320. The process confirmation 340 may request instructions from the client 320 to continue proceeding with the deposit now that the server 322 has received the image file 335. In response, the client 320 may send a deposit confirmation 345 to the server 322, instructing the server 322 to process the deposit of the check based on the image file 335 that had been received by the server 322.
In an implementation, the check images 458 may be received following a software call from the check processing module 454 to the image monitoring and capture module 456. In such an implementation, the image monitoring and capture module 456 may include the camera 207 contained within the mobile device 106. Alternately, the camera 207 may be detachably coupled to the mobile device 106 such as through a secure digital (SD) slot or over any suitable communications bus, such as USB (universal serial bus).
In an implementation, the image monitoring and capture module 456 may obtain one or a plurality of images to be analyzed by the check processing module 454. Check processing module 454 may identify the acceptable portions within the captured images and/or identify designated portion(s) of the image(s) to be recaptured. If check processing module 454 identifies designated portion(s) of the images to be recaptured, the check processing module 454 may provide a software call to the image monitoring and capture module 456 with instructions to recapture the designated portion(s) of the image(s). Once the acceptable portions have been identified and/or the designated portion(s) have been recaptured by the image monitoring and capture module 456, the portion(s) of the image files 315 are combined or stitched together by check processing module 454 to generate composite image file(s). Check processing module 454 may send the composite image to a financial institution (e.g., financial institution 130, the server 322, the server apparatus 570, etc.) for processing.
In an implementation, the client apparatus 450 may comprise a browser such as a web browser, for accessing a website on the Internet or other network associated with a financial institution. The user may access the website and select a “monitor and capture image” link or similar icon, button or link, for example, displayed on the browser. Such a selection may call the image monitoring and capture module 456 on the client apparatus 450.
The communications module 452 may be configured, in one example, to receive and send data signals over a suitable communications network. This may include, without limitation, GSM/GPRS, HSDPA, CDMA, TDMA, 802.11, 802.16 and the like. While the bandwidth available to the mobile device 106 may be an implementation concern, such discussion is outside the scope of the present discussion and any suitable wireless communications network is considered to be within the scope of the present discussion. With respect to the present discussion, the communications module 452 may receive one or more processed check images 460 from the check processing module 454 and may transmit them over the suitable communications network to the depository 204, as described herein.
The check processing module 454 may be configured, in one example, to cause the image monitoring and capture module 456 to monitor an image of at least one side of a check provided in a field of view of the camera 207 and then capture the images after it passes monitoring criteria. Compliance with the monitoring criteria is intended to ensure that the image of the check is suitable for one or more processing tasks. For instance, if the check is rotated 45 degrees clockwise when captured, the check processing module 454 or a software object operated on the server 322 described above may be unable to optically detect information on the check. The check processing module 454 may also be configured, in one example, to cause the image monitoring and capture module 456 to recapture a designated portion of the image of a check. For instance, if an area of the check containing information necessary for processing was not captured in a manner that would allow the check processing module 454 to process the image, the check processing module 454 may cause the image monitoring and capture module 456 to recapture that portion of the check, preferably at a higher resolution or with different settings for camera 207.
The check processing module 454 may perform one or more cleaning or processing operations on the captured image of the check. Such cleaning or processing may include dewarping and/or deskewing (if not part of the monitoring criteria, in an implementation), for example. Cleaning or processing may include down-converting the image received from the image capture module to a suitable size, such as 200 dots per inch (DPI) resolution or in a resolution range such as 200 DPI to 400 DPI, 300 DPI to 500 DPI, etc., and/or converting the image to grayscale or black and white. Such operation(s) may reduce the file size of the check image. Alternatively, the check processing module 454 may send instructions to the image monitoring and capture module 456 to cause the image monitoring and capture module 456 to capture an image of the check at a suitable resolution. The check processing module 454 may additionally perform any of the following operations, in further examples: convert from JPEG to TIFF, detect check information, perform signature detection on the image of the check, and the like. The check processing module 454 may, alternatively, send the captured check image to the server described herein for such processing, and receive confirmation that the operations were completed before further operations can proceed.
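As a non-limiting illustration of such down-conversion, the sketch below rescales a captured image to 200 DPI, converts it to grayscale, and saves it as TIFF. It assumes the Pillow library and that the capture's original DPI is known and passed in explicitly (an assumption); the target resolution is one example value from the range described above.

```python
# Illustrative sketch only: down-convert a captured check image to 200 DPI
# grayscale and save it as TIFF to reduce file size. Assumes the Pillow
# library and that the capture's original DPI is known.
from PIL import Image


def downconvert(path_in, path_out, original_dpi, target_dpi=200):
    img = Image.open(path_in).convert("L")          # convert to grayscale
    if original_dpi > target_dpi:
        scale = target_dpi / float(original_dpi)
        img = img.resize((int(img.width * scale), int(img.height * scale)),
                         Image.LANCZOS)
    img.save(path_out, format="TIFF", dpi=(target_dpi, target_dpi))
```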
The size of the file sent between the mobile device and the server may be small. This runs counter to automatic check detection against a background: if the image is captured in color, the check is easier to distinguish from the background. However, the processed image sent over the communications network may need to be smaller, and if the detection operation is performed by the server, it may be advantageous to convert the captured image to grayscale, or even black and white, before transmission to the server. Grayscale images are compliant with the Check 21 Act.
While “flat” is a fairly well known term to users, each user's appreciation of flat with respect to the camera lens of the camera 207 associated with the mobile device 106 may result in a problem with needing to align the check image programmatically or risk rejecting a large number of check images. As the image captured is a set of pixels, a tilted image will result in a jagged polygon rather than a perfect rectangle. Using convex hull algorithms, the check processing modules may create a smooth polygon around the boundary and remove the concavity of the check image. Alternatively, a rotating calipers algorithm may be used to determine the tightest fitting rectangle around the check boundary, which can then be used to determine the angle of it, with that angle being used to align the check properly.
The server apparatus 570 may include one or more software objects operating on a server operated by the depository 204. Aspects of an example server apparatus are described with respect to
The communications module 572 may be configured to receive a wireless communication from the mobile device 106 over any suitable communications network, such as those described above. The communications module 572 may additionally receive a communication over a different communications network than the mobile device 106 communicated on, such as receiving the communication over a TCP/IP (Transmission Control Protocol/Internet Protocol) connection from the user's communication provider.
The check processing module 574 may be configured, in one example, to perform one or more check processing operations on the processed image(s) 460 that are received. In an implementation, these operations may include any of the operations described herein with respect to the check processing module 454. The operation of signature verification may be performed by the check processing module 574 of the server apparatus 570 as the server apparatus 570 may interface with other systems of the depository 204 that may maintain previously verified signature samples of the user 102. Performing signature verification at the client apparatus 450 may be computationally unfeasible; additionally, there may be a security risk if the signature sample is stored on the user's own device.
A cropped grayscale image may be sent to the server apparatus 570. The server apparatus 570 may extract information via a TIFF conversion and determine the DPI and re-scale to the proper DPI (e.g., convert to TIFF and detect the DPI that was used in the grayscale image). In an implementation, DPI detection may run on the client apparatus 450.
The check clearance module 576 may be configured, in one example, to receive a file from the check processing module 574 and may communicate with a check clearinghouse such that a Check 21 compliant file may be delivered to the check clearinghouse and funds may be received by the depository 204. The availability of the funds to the user 102 may be delayed by this operation such that the user 102 only has access to those funds when the depository 204 receives confirmation that the check has cleared.
At 802, the image in the field of view of the camera may be monitored with respect to one or more monitoring criteria, such as those described above. The monitoring may be performed by the camera 207, the mobile device 106, and/or a computing device associated with the depository, for example. The monitoring may be performed pursuant to instructions received at the camera or mobile device from the deposit system operated by a depository, the server 322, or the server apparatus 570, for example. In an implementation, the results of the monitoring may indicate that the camera 207 and/or the check 108 should be repositioned and/or the light source should be adjusted prior to an image capture in order to capture an image of the check that may be processed properly, e.g., to have the data from the check obtained without error from the image, so that the check can be cleared.
At 804, when the image in the field of view passes the monitoring criteria as determined at 802, a plurality of images of the field of view may be captured by the camera. This may be accomplished through the software object accessing a camera associated with the mobile device (e.g., either comprised within the mobile device or separate from the mobile device). This may be done through an API exposed by the OS of the mobile device, or may be through software code customized for a specific phone and specific camera. With respect to the former, a developer of the software object may write code to the camera API(s), which may be specific to the OS and without regard to the camera on the device. The user may initiate the capture of the images (e.g., by pressing a button on the camera or the mobile device) or the images may be captured automatically, without user intervention, as soon as the image in the field of view is determined to have passed the monitoring criteria.
At 806, the plurality of captured images of the check may be analyzed to identify the acceptable portions within the plurality of captured images. Various portions of the captured images of the check may not be captured in a manner that can be processed for deposit. For example, certain areas of the captured images may be blurry, out of focus, overexposed, washed out, of incorrect dimensions, of insufficient contrast, taken in insufficient light, etc. However, other portions of the captured images may have been sufficiently captured for processing and are therefore acceptable portions within the plurality of captured images.
At 808, the acceptable portions within the plurality of images may be combined or stitched together to generate a composite image of the document. The composite image may resemble the document as a whole, such as check image 247. Alternatively, the composite image may reflect only the relevant portions of the document to processing, such as the MICR line, routing number, account number, check number, amount, payor name, payee name, signature line, endorsement signature, endorsement account number, etc. In this manner, the occurrence of non-conforming images downstream (e.g., at a depository or financial institution) is reduced, and there is a high confidence that the composite image will be properly processed downstream.
At 810, the composite image may be further processed to improve clarity of a blurry or otherwise unsatisfactory image. Any technique for processing a digital image may be used, such as post-processing image compositing. Post-processing may also utilize any known image processing software or other application(s) to obtain the relevant data from the composite image.
At 812, the composite image may be transmitted to a depository, e.g., as a digital image file. At 814, the depository may receive the composite image of the check (along with financial information pertaining to the account for depositing funds, for example) and may process the composite image at 816. Processing of the digital image file may include retrieving financial information regarding the check. The financial information may comprise the MICR number, the routing number, an amount, etc. Any known image processing technology may be used, such as edge detection, filtering to remove imagery other than the check image or check data in the received digital image file, image sharpening, and technologies to distinguish between the front and the back sides of the check. The depository may identify and/or remove at least a portion of data that is extraneous to the check, such as background data.
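By way of example only, once the MICR line has been read into text by whatever OCR engine the depository uses, the routing, account, and check numbers might be pulled out with a simple pattern. The separator symbols and field order assumed below vary by OCR engine and are purely illustrative.

```python
# Sketch: extract routing/account/check numbers from an already-OCR'd MICR line.
# Assumes the transit symbol comes out as "⑆" or ":" and the on-us symbol as
# "⑈" or ";" -- conventions differ between OCR engines.
import re

MICR_PATTERN = re.compile(r"[⑆:](\d{9})[⑆:]\s*(\d{4,17})[⑈;]?\s*(\d{1,6})?")

def parse_micr(micr_text):
    match = MICR_PATTERN.search(micr_text)
    if not match:
        return None
    routing, account, check_number = match.groups()
    return {"routing": routing, "account": account, "check_number": check_number}
```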
After retrieving the financial information from the check in an electronic data representation form, the depository may determine whether the financial information (such as the amount payable to the user, the account associated with the user in which to deposit funds, an account associated with a payor to debit funds, and an institution associated with the payor) is valid. For example, the depository may include electronic devices such as computers, servers, databases, or the like that may be in communication with each other. The electronic devices may receive an electronic data representation and may perform an analysis on the quality of the data representation, the readability of the data representation, or the like. For example, the electronic devices may determine whether the account number, amount payable, or the like are readable such that they may be parsed and processed by the depository to credit an account associated with the user.
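One concrete validity check that fits this stage is the standard ABA routing-number checksum, in which the nine digits, weighted 3, 7, 1 in repetition, must sum to a multiple of 10. The sketch below assumes the routing number has already been extracted as a string of digits.

```python
# Sketch: ABA routing-number checksum (weights 3, 7, 1 over the nine digits).
def routing_number_is_valid(routing):
    if len(routing) != 9 or not routing.isdigit():
        return False
    weights = (3, 7, 1, 3, 7, 1, 3, 7, 1)
    total = sum(int(d) * w for d, w in zip(routing, weights))
    return total % 10 == 0
```

A failed checksum is a strong signal that the MICR line was misread and that the image (or the designated portion containing the MICR line) should be recaptured rather than submitted.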
At 818, if the financial information is determined to be valid, the electronic data representation may be processed by the depository, thereby processing the document. At 820, the funds represented by the document are deposited in the user's account. At 822, the depository may notify the user of the deposit. If the financial information is determined to be invalid, the user may be advised instead. For example, the depository may transmit an email, a web message, an instant message, or the like to the user indicating either that the deposit was successful or that the financial information associated with the electronic data representation is invalid. The user may determine how to proceed by selecting an option on the web message, replying to the email, or the like.
Thus, in an implementation, instructions on how the user would like to proceed may be requested from the user, such as whether the user would like to try the deposit again (e.g., capture another image of the check that passes the monitoring criteria and send it to the depository) or whether the user would like assistance from a representative, for example. The user may indicate how they would like to proceed. If the user would like assistance, the financial information may be transferred to a representative for further review. The representative may review the financial information associated with the electronic data representation to determine whether to allow the electronic data representation to be processed by the depository. If so, the electronic data representation of the financial information may be processed by the depository, thereby depositing the check in the user's account. The depository may send a notice to the user via email, facsimile, instant message, or mail, for example, that the check has been deposited into the selected account.
At 930, the user may send a request to deposit the check and may select an account in which to deposit the check. In an implementation, the user may select a “deposit check” option provided on the website, and may enter details such as check amount, date, the account the check funds should be deposited in, comments, etc.
At 940, a plurality of images in the field of view of the camera may be provided to and received by the institution, via the communication pathway. Still images may be provided or a video may be provided, such as a video stream generated by the camera.
At 950, the institution may analyze the plurality of images or frames of the video stream to identify the acceptable portions within the plurality of captured images or frames. Various portions of the captured images of the check may not be captured in a manner that can be processed for deposit. For example, certain areas of the captured images may be blurry, out of focus, overexposed, washed out, of improper dimensions or contrast, taken in insufficient light, etc. However, other portions of the captured images may have been sufficiently captured for processing and are therefore acceptable portions within the plurality of captured images.
At 960, the institution may combine or stitch together the acceptable portions to generate a composite image of the document. The composite image may resemble the document as a whole, such as check image 247. Alternatively, the composite image may reflect only the portions of the document relevant to processing, such as the MICR line, routing number, account number, check number, amount, payor name, payee name, signature line, endorsement signature, endorsement account number, etc. In this manner, the occurrence of non-conforming images downstream (e.g., at a depository or financial institution) is reduced, and there is a high confidence that the composite image will be properly processed downstream.
At 970, the institution may further process the composite image to improve the clarity of a blurry or otherwise unsatisfactory image. Any technique for processing a digital image may be used, such as post-processing image compositing. Post-processing may also utilize any known image processing software or other application(s) to obtain the relevant data from the composite image.
At 980, the institution may process the digital images to obtain an image of the check and to obtain check data from that image. At 990, the institution processes the check data and deposits the funds of the check in the user's account, as described herein. It is contemplated that processing such as grayscale conversion, image cropping, image compression, edge and/or corner detection, etc. may be implemented in the method 900. Such operations may be performed on one or more digital images created by the camera and may be performed on the image(s) by the mobile device and/or by the institution, as described further above.
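For illustration, the grayscale conversion, cropping, and compression mentioned for method 900 might be chained as follows; the Canny thresholds, crop strategy, and JPEG quality are assumptions made for the example.

```python
# Sketch: grayscale conversion, edge-based cropping, and JPEG compression of a
# check image prior to (or after) transmission.
import cv2
import numpy as np

def prepare_check_image(image_bgr, jpeg_quality=80):
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    ys, xs = np.nonzero(edges)
    if xs.size:                                   # crop to the detected document area
        gray = gray[ys.min():ys.max() + 1, xs.min():xs.max() + 1]
    ok, jpeg_bytes = cv2.imencode(".jpg", gray,
                                  [int(cv2.IMWRITE_JPEG_QUALITY), jpeg_quality])
    return jpeg_bytes.tobytes() if ok else None
```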
Although the examples described herein may refer to uploading of images of checks to an institution, it is contemplated that any type of document or image (e.g., vehicle accident pictures provided to an insurance company) may be processed and/or transmitted using the techniques described herein. Additionally, one or more of the techniques described herein may be performed by the institution instead of the mobile device of the user.
At 1010, the image in the field of view of the camera may be monitored with respect to one or more monitoring criteria, such as those described above. The monitoring may be performed by the camera 207, the mobile device 106, and/or a computing device associated with the depository, for example. The monitoring may be performed pursuant to instructions received at the camera or mobile device from the deposit system operated by a depository, the server 322, or the server apparatus 570, for example. In an implementation, the results of the monitoring may indicate that the camera 207 and/or the check 108 should be repositioned and/or the light source should be adjusted prior to an image capture in order to capture an image of the check that may be processed properly, e.g., to have the data from the check obtained from the image without error, so that the check can be cleared.
At 1020, when the image in the field of view passes the monitoring criteria as determined at 1010, one or more images of the field of view may be captured by the camera. This may be accomplished through the software object accessing a camera associated with the mobile device (e.g., either comprised within the mobile device or separate from the mobile device). This may be done through an API exposed by the OS of the mobile device, or through software code customized for a specific phone and specific camera. With respect to the former, a developer of the software object may write code to the camera API(s), which may be specific to the OS and without regard to the camera on the device. The user may initiate the capture of the images (e.g., by pressing a button on the camera or the mobile device) or the images may be captured automatically, without user intervention, as soon as the image in the field of view is determined to have passed the monitoring criteria.
At 1030, a captured image of the check may be analyzed to identify a designated portion within the captured image to be recaptured. The designated portion of the captured image of the check may not have been captured in a manner that can be processed for deposit. For example, the designated portion of the captured image may be blurry, out of focus, overexposed, washed out, of improper dimensions or contrast, taken in insufficient light, etc.
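A minimal sketch of how the designated portion might be identified at 1030 is to score fixed-size tiles of the capture and nominate the worst tile that fails the focus criterion; the tile size and threshold are assumptions.

```python
# Sketch: find the worst failing tile and return it as the portion to recapture.
import cv2

def designated_portion(image_bgr, tile=128, min_focus=100.0):
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    h, w = gray.shape
    worst = None
    for y in range(0, h - tile + 1, tile):
        for x in range(0, w - tile + 1, tile):
            focus = cv2.Laplacian(gray[y:y + tile, x:x + tile], cv2.CV_64F).var()
            if focus < min_focus and (worst is None or focus < worst[0]):
                worst = (focus, (x, y, tile, tile))
    return worst[1] if worst else None            # (x, y, width, height) or None
```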
At 1040, the designated portion within the captured image is recaptured by the camera. This may be accomplished through the software object accessing a camera associated with the mobile device (e.g., either comprised within the mobile device or separate from the mobile device). This may be done through an API exposed by the OS of the mobile device, or through software code customized for a specific phone and specific camera. With respect to the former, a developer of the software object may write code to the camera API(s), which may be specific to the OS and without regard to the camera on the device. The user may initiate the recapture (e.g., by pressing a button on the camera or the mobile device) or the designated portion may be recaptured automatically, without user intervention, as soon as the image in the field of view is determined to have passed new monitoring criteria.
At 1050, the captured image and the recaptured image of the designated portion may be combined or stitched together to generate a composite image of the document. The composite image may resemble the document as a whole, such as check image 247. Alternatively, the composite image may reflect only the portions of the document relevant to processing, such as the MICR line, routing number, account number, check number, amount, payor name, payee name, signature line, endorsement signature, endorsement account number, etc. In this manner, the occurrence of non-conforming images downstream (e.g., at a depository or financial institution) is reduced, and there is a high confidence that the composite image will be properly processed downstream.
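As a sketch of the combining at 1050, the recaptured designated portion can simply be resized to the designated rectangle and spliced back into the original capture, assuming the recapture has already been framed on that region.

```python
# Sketch: splice a recaptured designated portion back into the original capture.
import cv2

def patch_designated_portion(original_bgr, recaptured_bgr, rect):
    x, y, w, h = rect
    patch = cv2.resize(recaptured_bgr, (w, h))    # fit the recapture to the region
    composite = original_bgr.copy()
    composite[y:y + h, x:x + w] = patch
    return composite
```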
At 1060, the composite image may be transmitted to a depository, e.g., as a digital image file. As described herein, at 1070, the depository may receive the image of the check (along with financial information pertaining to the account for depositing funds, for example) and may process the image. Processing of the digital image file may include retrieving financial information regarding the check. The financial information may comprise the MICR number, the routing number, an amount, etc. Any known image processing technology may be used, such as edge detection, filtering to remove imagery other than the check image or check data in the received digital image file, image sharpening, and technologies to distinguish between the front and the back sides of the check. The depository may identify and/or remove at least a portion of data that is extraneous to the check, such as background data.
After retrieving the financial information from the check in an electronic data representation form, the depository may determine whether the financial information (such as the amount payable to the user, the account associated with the user in which to deposit funds, an account associated with a payor to debit funds, and an institution associated with the payor) is valid. For example, the depository may include electronic devices such as computers, servers, databases, or the like that may be in communication with each other. The electronic devices may receive an electronic data representation and may perform an analysis on the quality of the data representation, the readability of the data representation, or the like. For example, the electronic devices may determine whether the account number, amount payable, or the like are readable such that they may be parsed and processed by the depository to credit an account associated with the user.
If the financial information is determined to be valid, the electronic data representation may be processed by the depository, thereby depositing the money in the user's account. If the financial information is determined to be invalid, then the user may be advised. For example, the depository may transmit an email, a web message, an instant message, or the like to the user indicating that the financial information associated with the electronic data representation may be invalid. The user may determine how to proceed by selecting an option on the web message, replying to the email, or the like.
Thus, in an implementation, instructions on how the user would like to proceed may be requested from the user, such as whether the user would like to try the deposit again (e.g., capture another image of the check that passes the monitoring criteria and send it to the depository) or whether the user would like assistance from a representative, for example. The user may indicate how they would like to proceed. If the user would like assistance, the financial information may be transferred to a representative for further review. The representative may review the financial information associated with the electronic data representation to determine whether to allow the electronic data representation to be processed by the depository. If so, the electronic data representation of the financial information may be processed by the depository, thereby depositing the check in the user's account. The depository may send a notice to the user via email, facsimile, instant message, or mail, for example, that the check has been deposited into the selected account.
In another implementation, the above described implementations may be combined. For example, when the image of the check 108 in the field of view passes the monitoring criteria, a plurality of images may be automatically taken by the camera of the same side of check 108. The plurality of images may be analyzed to identify acceptable portions within the plurality of captured images. It may be determined that there are insufficient acceptable portions within the plurality of captured images for processing. The plurality of images may then be analyzed to identify a designated portion within a captured image of check 108 to be recaptured by the camera. User 102 may be instructed to recapture the designated portion of check 108, preferably at a higher resolution. The acceptable portions within the captured images and the recaptured designated portion may be combined or stitched together by mobile device 106 to generate a composite image. The composite image may be provided from the mobile device 106 to a financial institution. This combination is merely exemplary and other combinations of implementations are contemplated.
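Tying the combined implementation together, a sketch reusing the illustrative helpers above might proceed as follows; the helper names and the specific flow are assumptions for the example, not part of the disclosure beyond what is described in the text.

```python
# Sketch: combined flow -- burst capture, tile selection, compositing, and a
# single recapture of the worst region if coverage is incomplete.
def deposit_flow(captures, canvas_shape):
    tiles = [acceptable_tiles(img) for img in captures]
    composite, complete = build_composite(canvas_shape, tiles)
    if not complete:
        # Insufficient acceptable portions: recapture the worst region and patch it in.
        rect = designated_portion(captures[0])
        burst = capture_burst(num_images=1)
        if rect is not None and burst:
            composite = patch_designated_portion(composite, burst[0], rect)
    return sharpen(composite)   # composite ready to transmit to the institution
```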
In an implementation, mobile device 106, server 322 and server apparatus 570 may be implemented using a computer and computing environment.
Computer-executable instructions, such as program modules executed by a computer, may be used. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. Distributed computing environments may be used where tasks are performed by remote processing devices that are linked through a communications network or other data transmission medium. In a distributed computing environment, program modules and other data may be located in both local and remote computer storage media including memory storage devices.
With reference to
The storage device 1122 represents one or more mechanisms for storing data. For example, the storage device 1122 may include read-only memory (ROM), RAM, magnetic disk storage media, optical storage media, flash memory devices, and/or other machine-readable media. In other embodiments, any appropriate type of storage device may be used. Although only one storage device 1122 is shown, multiple storage devices and multiple types of storage devices may be present. Further, although the computer 1110 is drawn to contain the storage device 1122, it may be distributed across other computers, for example on a server.
The storage device 1122 includes a controller (not shown in
The output device 1124 is that part of the computer 1110 that displays output to the user. The output device 1124 may be a liquid crystal display (LCD) well-known in the art of computer hardware. In other embodiments, the output device 1124 may be replaced with a gas or plasma-based flat-panel display or a traditional cathode-ray tube (CRT) display. In still other embodiments, any appropriate display device may be used. Although only one output device 1124 is shown, in other embodiments any number of output devices of different types, or of the same type, may be present. In an embodiment, the output device 1124 displays a user interface. The input device 1126 may be a keyboard, mouse or other pointing device, trackball, touchpad, touch screen, keypad, microphone, voice recognition device, or any other appropriate mechanism for the user to input data to the computer 1110 and manipulate the user interface previously discussed. Although only one input device 1126 is shown, in another embodiment any number and type of input devices may be present.
The network interface device 1128 provides connectivity from the computer 1110 to the network 1114 through any suitable communications protocol. The network interface device 1128 sends and receives data items from the network 1114. The bus 1130 may represent one or more busses, e.g., USB, PCI, ISA (Industry Standard Architecture), X-Bus, EISA (Extended Industry Standard Architecture), or any other appropriate bus and/or bridge (also called a bus controller).
The computer 1110 may be implemented using any suitable hardware and/or software, such as a personal computer or other electronic computing device. Portable computers, laptop or notebook computers, PDAs, pocket computers, appliances, telephones, and mainframe computers are examples of other possible configurations of the computer 1110. For example, other peripheral devices such as audio adapters or chip programming devices, such as EPROM (Erasable Programmable Read-Only Memory) programming devices may be used in addition to, or in place of, the hardware already depicted.
The network 1114 may be any suitable network and may support any appropriate protocol suitable for communication to the computer 1110. In an embodiment, the network 1114 may support wireless communications. In another embodiment, the network 1114 may support hard-wired communications, such as a telephone line or cable. In another embodiment, the network 1114 may support the Ethernet IEEE (Institute of Electrical and Electronics Engineers) 802.3x specification. In another embodiment, the network 1114 may be the Internet and may support IP (Internet Protocol). In another embodiment, the network 1114 may be a LAN or a WAN. In another embodiment, the network 1114 may be a hotspot service provider network. In another embodiment, the network 1114 may be an intranet. In another embodiment, the network 1114 may be a GPRS (General Packet Radio Service) network. In another embodiment, the network 1114 may be any appropriate cellular data network or cell-based radio network technology. In another embodiment, the network 1114 may be an IEEE 802.11 wireless network. In still another embodiment, the network 1114 may be any suitable network or combination of networks. Although one network 1114 is shown, in other embodiments any number of networks (of the same or different types) may be present.
It should be understood that the various techniques described herein may be implemented in connection with hardware or software or, where appropriate, with a combination of both. Thus, the methods and apparatus of the presently disclosed subject matter, or certain aspects or portions thereof, may take the form of program code (i.e., instructions) embodied in tangible media, such as floppy diskettes, CD-ROMs, hard drives, or any other machine-readable storage medium wherein, when the program code is loaded into and executed by a machine, such as a computer, the machine becomes an apparatus for practicing the presently disclosed subject matter. In the case of program code execution on programmable computers, the computing device generally includes a processor, a storage medium readable by the processor (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device. One or more programs may implement or use the processes described in connection with the presently disclosed subject matter, e.g., through the use of an API, reusable controls, or the like. Such programs may be implemented in a high level procedural or object-oriented programming language to communicate with a computer system. However, the program(s) can be implemented in assembly or machine language, if desired. In any case, the language may be a compiled or interpreted language and it may be combined with hardware implementations.
Although exemplary embodiments may refer to using aspects of the presently disclosed subject matter in the context of one or more stand-alone computer systems, the subject matter is not so limited, but rather may be implemented in connection with any computing environment, such as a network or distributed computing environment. Still further, aspects of the presently disclosed subject matter may be implemented in or across a plurality of processing chips or devices, and storage may similarly be effected across a plurality of devices. Such devices might include personal computers, network servers, and handheld devices, for example. Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.
This application claims the benefit of U.S. Provisional Patent Application No. 62/167,754, filed May 28, 2016, the entirety of which is hereby incorporated by reference herein.
Number | Name | Date | Kind |
---|---|---|---|
1748489 | McCarthy et al. | Feb 1930 | A |
2292825 | Dilks et al. | Aug 1942 | A |
3005282 | Christiansen | Oct 1961 | A |
3341820 | Grillmeier, Jr. et al. | Sep 1967 | A |
3576972 | Wood | May 1971 | A |
3593913 | Bremer | Jul 1971 | A |
3620553 | Donovan | Nov 1971 | A |
3648242 | Grosbard | Mar 1972 | A |
3800124 | Walsh | Mar 1974 | A |
3816943 | Henry | Jun 1974 | A |
4002356 | Weidmann | Jan 1977 | A |
4027142 | Paup et al. | May 1977 | A |
4060711 | Buros | Nov 1977 | A |
4070649 | Wright, Jr. et al. | Jan 1978 | A |
4128202 | Buros | Dec 1978 | A |
4136471 | Austin | Jan 1979 | A |
4205780 | Burns | Jun 1980 | A |
4264808 | Owens | Apr 1981 | A |
4305216 | Skelton | Dec 1981 | A |
4321672 | Braun | Mar 1982 | A |
4346442 | Musmanno | Aug 1982 | A |
4417136 | Rushby et al. | Nov 1983 | A |
4433436 | Carnes | Feb 1984 | A |
4454610 | Sziklai | Jun 1984 | A |
RE31692 | Tyburski et al. | Oct 1984 | E |
4523330 | Cain | Jun 1985 | A |
4636099 | Goldston | Jan 1987 | A |
4640413 | Kaplan | Feb 1987 | A |
4644144 | Chandek | Feb 1987 | A |
4722444 | Murphy et al. | Feb 1988 | A |
4722544 | Weber | Feb 1988 | A |
4727435 | Otani | Feb 1988 | A |
4737911 | Freeman | Apr 1988 | A |
4739411 | Bolton | Apr 1988 | A |
4774574 | Daly et al. | Sep 1988 | A |
4774663 | Musmanno | Sep 1988 | A |
4790475 | Griffin | Dec 1988 | A |
4806780 | Yamamoto | Feb 1989 | A |
4837693 | Schotz | Jun 1989 | A |
4890228 | Longfield | Dec 1989 | A |
4927071 | Wood | May 1990 | A |
4934587 | McNabb | Jun 1990 | A |
4960981 | Benton | Oct 1990 | A |
4975735 | Bright | Dec 1990 | A |
5022683 | Barbour | Jun 1991 | A |
5053607 | Carlson | Oct 1991 | A |
5077805 | Tan | Dec 1991 | A |
5091968 | Higgins et al. | Feb 1992 | A |
5122950 | Benton et al. | Jun 1992 | A |
5134564 | Dunn et al. | Jul 1992 | A |
5146606 | Grondalski | Sep 1992 | A |
5157620 | Shaar | Oct 1992 | A |
5159548 | Caslavka | Oct 1992 | A |
5164833 | Aoki | Nov 1992 | A |
5175682 | Higashiyama et al. | Dec 1992 | A |
5187750 | Behera | Feb 1993 | A |
5191525 | LeBrun | Mar 1993 | A |
5193121 | Elischer et al. | Mar 1993 | A |
5220501 | Lawlor | Jun 1993 | A |
5227863 | Bilbrey et al. | Jul 1993 | A |
5229589 | Schneider | Jul 1993 | A |
5233547 | Kapp et al. | Aug 1993 | A |
5237158 | Kern et al. | Aug 1993 | A |
5237159 | Stephens | Aug 1993 | A |
5237620 | Deaton et al. | Aug 1993 | A |
5257320 | Etherington et al. | Oct 1993 | A |
5265008 | Benton | Nov 1993 | A |
5268968 | Yoshida | Dec 1993 | A |
5283829 | Anderson | Feb 1994 | A |
5321816 | Rogan | Jun 1994 | A |
5345090 | Hludzinski | Sep 1994 | A |
5347302 | Simonoff | Sep 1994 | A |
5350906 | Brody | Sep 1994 | A |
5373550 | Campbell | Dec 1994 | A |
5383113 | Kight et al. | Jan 1995 | A |
5419588 | Wood | May 1995 | A |
5422467 | Graef | Jun 1995 | A |
5444616 | Nair et al. | Aug 1995 | A |
5444794 | Uhland, Sr. | Aug 1995 | A |
5455875 | Chevion et al. | Oct 1995 | A |
5475403 | Havlovick et al. | Dec 1995 | A |
5504538 | Tsujihara | Apr 1996 | A |
5504677 | Pollin | Apr 1996 | A |
5528387 | Kelly et al. | Jun 1996 | A |
5530773 | Thompson | Jun 1996 | A |
5577179 | Blank | Nov 1996 | A |
5583759 | Geer | Dec 1996 | A |
5590196 | Moreau | Dec 1996 | A |
5594225 | Botvin | Jan 1997 | A |
5598969 | Ong | Feb 1997 | A |
5602936 | Green | Feb 1997 | A |
5610726 | Nonoshita | Mar 1997 | A |
5611028 | Shibasaki | Mar 1997 | A |
5630073 | Nolan | May 1997 | A |
5631984 | Graf et al. | May 1997 | A |
5668897 | Stolfo | Sep 1997 | A |
5673320 | Ray et al. | Sep 1997 | A |
5677955 | Doggett | Oct 1997 | A |
5678046 | Cahill et al. | Oct 1997 | A |
5679938 | Templeton | Oct 1997 | A |
5680611 | Rail | Oct 1997 | A |
5691524 | Josephson | Nov 1997 | A |
5699452 | Vaidyanathan | Dec 1997 | A |
5734747 | Vaidyanathan | Mar 1998 | A |
5737440 | Kunkler | Apr 1998 | A |
5748780 | Stolfo | May 1998 | A |
5751842 | Riach | May 1998 | A |
5761686 | Bloomberg | Jun 1998 | A |
5784503 | Bleecker, III et al. | Jul 1998 | A |
5830609 | Warner | Nov 1998 | A |
5832463 | Funk | Nov 1998 | A |
5838814 | Moore | Nov 1998 | A |
5848185 | Koga et al. | Dec 1998 | A |
5859935 | Johnson et al. | Jan 1999 | A |
5863075 | Rich | Jan 1999 | A |
5870456 | Rogers | Feb 1999 | A |
5870724 | Lawlor | Feb 1999 | A |
5870725 | Bellinger et al. | Feb 1999 | A |
5878337 | Joao | Mar 1999 | A |
5889884 | Hashimoto et al. | Mar 1999 | A |
5893101 | Balogh et al. | Apr 1999 | A |
5897625 | Gustin | Apr 1999 | A |
5898157 | Mangili et al. | Apr 1999 | A |
5901253 | Tretter | May 1999 | A |
5903878 | Talati | May 1999 | A |
5903881 | Schrader | May 1999 | A |
5910988 | Ballard | Jun 1999 | A |
5917931 | Kunkler | Jun 1999 | A |
5924737 | Schrupp | Jul 1999 | A |
5926548 | Okamoto | Jul 1999 | A |
5930778 | Geer | Jul 1999 | A |
5937396 | Konya | Aug 1999 | A |
5940844 | Cahill | Aug 1999 | A |
5982918 | Mennie | Nov 1999 | A |
5987439 | Gustin et al. | Nov 1999 | A |
6012048 | Gustin et al. | Jan 2000 | A |
6014454 | Kunkler | Jan 2000 | A |
6021202 | Anderson | Feb 2000 | A |
6021397 | Jones | Feb 2000 | A |
6023705 | Bellinger et al. | Feb 2000 | A |
6029887 | Furuhashi | Feb 2000 | A |
6030000 | Diamond | Feb 2000 | A |
6032137 | Ballard | Feb 2000 | A |
6038553 | Hyde | Mar 2000 | A |
6053405 | Irwin, Jr. et al. | Apr 2000 | A |
6059185 | Funk et al. | May 2000 | A |
6064762 | Haenel | May 2000 | A |
6072941 | Suzuki et al. | Jun 2000 | A |
6073119 | Borenmisza-Wahr | Jun 2000 | A |
6073121 | Ramzy | Jun 2000 | A |
6085168 | Mori | Jul 2000 | A |
6086708 | Colgate | Jul 2000 | A |
6089450 | Koeple | Jul 2000 | A |
6089610 | Greene | Jul 2000 | A |
6092047 | Hyman et al. | Jul 2000 | A |
6097834 | Krouse | Aug 2000 | A |
6097845 | Ng et al. | Aug 2000 | A |
6097885 | Rayner | Aug 2000 | A |
6105865 | Hardesty | Aug 2000 | A |
6128603 | Dent et al. | Oct 2000 | A |
6141339 | Kaplan et al. | Oct 2000 | A |
6145738 | Stinson et al. | Nov 2000 | A |
6148102 | Stolin | Nov 2000 | A |
6149056 | Stinson et al. | Nov 2000 | A |
6151409 | Chen et al. | Nov 2000 | A |
6151423 | Melen | Nov 2000 | A |
6151426 | Lee | Nov 2000 | A |
6159585 | Rittenhouse | Dec 2000 | A |
6170744 | Lee | Jan 2001 | B1 |
6178409 | Weber et al. | Jan 2001 | B1 |
6181837 | Cahill et al. | Jan 2001 | B1 |
6188506 | Kaiserman | Feb 2001 | B1 |
6189785 | Lowery | Feb 2001 | B1 |
6192165 | Irons | Feb 2001 | B1 |
6195694 | Chen et al. | Feb 2001 | B1 |
6199055 | Kara | Mar 2001 | B1 |
6236009 | Emigh et al. | May 2001 | B1 |
6243689 | Norton | Jun 2001 | B1 |
6278983 | Ball | Aug 2001 | B1 |
6282523 | Tedesco et al. | Aug 2001 | B1 |
6282826 | Richards | Sep 2001 | B1 |
6293469 | Masson et al. | Sep 2001 | B1 |
6304860 | Martin | Oct 2001 | B1 |
6310647 | Parulski et al. | Oct 2001 | B1 |
6314452 | Dekel | Nov 2001 | B1 |
6317727 | May | Nov 2001 | B1 |
6328207 | Gregoire et al. | Dec 2001 | B1 |
6330546 | Gopinathan et al. | Dec 2001 | B1 |
6339658 | Moccagatta | Jan 2002 | B1 |
6339766 | Gephart | Jan 2002 | B1 |
6351553 | Hayosh | Feb 2002 | B1 |
6351735 | Deaton et al. | Feb 2002 | B1 |
6354490 | Weiss et al. | Mar 2002 | B1 |
6363162 | Moed et al. | Mar 2002 | B1 |
6363164 | Jones et al. | Mar 2002 | B1 |
6390362 | Martin | May 2002 | B1 |
6397196 | Kravetz | May 2002 | B1 |
6408084 | Foley | Jun 2002 | B1 |
6411725 | Rhoads | Jun 2002 | B1 |
6411737 | Wesolkowski et al. | Jun 2002 | B2 |
6411938 | Gates et al. | Jun 2002 | B1 |
6413305 | Mehta | Jul 2002 | B1 |
6417869 | Do | Jul 2002 | B1 |
6425017 | Dievendorff | Jul 2002 | B1 |
6429952 | Olbricht | Aug 2002 | B1 |
6439454 | Masson et al. | Aug 2002 | B1 |
6449397 | Che-Chu | Sep 2002 | B1 |
6450403 | Martens et al. | Sep 2002 | B1 |
6463220 | Dance et al. | Oct 2002 | B1 |
6464134 | Page | Oct 2002 | B1 |
6469745 | Yamada et al. | Oct 2002 | B1 |
6470325 | Leemhuis | Oct 2002 | B1 |
6473519 | Pidhirny et al. | Oct 2002 | B1 |
6502747 | Stoutenburg et al. | Jan 2003 | B1 |
6505178 | Flenley | Jan 2003 | B1 |
6546119 | Ciolli et al. | Apr 2003 | B2 |
6574377 | Cahill et al. | Jun 2003 | B1 |
6574609 | Downs | Jun 2003 | B1 |
6578760 | Otto | Jun 2003 | B1 |
6587837 | Spagna | Jul 2003 | B1 |
6606117 | Windle | Aug 2003 | B1 |
6609200 | Anderson | Aug 2003 | B2 |
6611598 | Hayosh | Aug 2003 | B1 |
6614930 | Agnihotri et al. | Sep 2003 | B1 |
6643416 | Daniels | Nov 2003 | B1 |
6647136 | Jones et al. | Nov 2003 | B2 |
6654487 | Downs, Jr. | Nov 2003 | B1 |
6661910 | Jones et al. | Dec 2003 | B2 |
6669086 | Abdi et al. | Dec 2003 | B2 |
6672452 | Alves | Jan 2004 | B1 |
6682452 | Quintus | Jan 2004 | B2 |
6695204 | Stinson | Feb 2004 | B1 |
6711474 | Treyz et al. | Mar 2004 | B1 |
6726097 | Graef | Apr 2004 | B2 |
6728397 | McNeal | Apr 2004 | B2 |
6738496 | Van Hall | May 2004 | B1 |
6742128 | Joiner | May 2004 | B1 |
6745186 | Testa et al. | Jun 2004 | B1 |
6754640 | Bozeman | Jun 2004 | B2 |
6755340 | Voss et al. | Jun 2004 | B1 |
6760414 | Schurko et al. | Jul 2004 | B1 |
6760470 | Bogosian et al. | Jul 2004 | B1 |
6763226 | McZeal | Jul 2004 | B1 |
6781962 | Williams | Aug 2004 | B1 |
6786398 | Stinson et al. | Sep 2004 | B1 |
6789054 | Makhlouf | Sep 2004 | B1 |
6796489 | Slater et al. | Sep 2004 | B2 |
6796491 | Nakajima | Sep 2004 | B2 |
6806903 | Okisu et al. | Oct 2004 | B1 |
6807294 | Yamazaki | Oct 2004 | B2 |
6813733 | Li | Nov 2004 | B1 |
6829704 | Zhang | Dec 2004 | B2 |
6844885 | Anderson | Jan 2005 | B2 |
6856965 | Stinson | Feb 2005 | B1 |
6863214 | Garner et al. | Mar 2005 | B2 |
6870947 | Kelland | Mar 2005 | B2 |
6873728 | Bernstein et al. | Mar 2005 | B2 |
6883140 | Acker | Apr 2005 | B1 |
6898314 | Kung et al. | May 2005 | B2 |
6902105 | Koakutsu | Jun 2005 | B2 |
6910023 | Schibi | Jun 2005 | B1 |
6913188 | Wong | Jul 2005 | B2 |
6931255 | Mekuria | Aug 2005 | B2 |
6931591 | Brown | Aug 2005 | B1 |
6934719 | Nally | Aug 2005 | B2 |
6947610 | Sun | Sep 2005 | B2 |
6957770 | Robinson | Oct 2005 | B1 |
6961689 | Greenberg | Nov 2005 | B1 |
6970843 | Forte | Nov 2005 | B1 |
6973589 | Wright | Dec 2005 | B2 |
6983886 | Natsukari et al. | Jan 2006 | B2 |
6993507 | Meyer | Jan 2006 | B2 |
6996263 | Jones et al. | Feb 2006 | B2 |
6999943 | Johnson | Feb 2006 | B1 |
7003040 | Yi | Feb 2006 | B2 |
7004382 | Sandru | Feb 2006 | B2 |
7010155 | Koakutsu et al. | Mar 2006 | B2 |
7010507 | Anderson | Mar 2006 | B1 |
7016704 | Pallakoff | Mar 2006 | B2 |
7027171 | Watanabe | Apr 2006 | B1 |
7028886 | Maloney | Apr 2006 | B1 |
7039048 | Monta | May 2006 | B1 |
7046991 | Little | May 2006 | B2 |
7051001 | Slater | May 2006 | B1 |
7058036 | Yu | Jun 2006 | B1 |
7062099 | Li et al. | Jun 2006 | B2 |
7062456 | Riehl et al. | Jun 2006 | B1 |
7062768 | Kubo | Jun 2006 | B2 |
7072862 | Wilson | Jul 2006 | B1 |
7076458 | Lawlor et al. | Jul 2006 | B2 |
7086003 | Demsky | Aug 2006 | B2 |
7092561 | Downs, Jr. | Aug 2006 | B2 |
7104443 | Paul et al. | Sep 2006 | B1 |
7113925 | Waserstein | Sep 2006 | B2 |
7114649 | Nelson | Oct 2006 | B2 |
7116446 | Maurer | Oct 2006 | B2 |
7117171 | Pollin | Oct 2006 | B1 |
7120461 | Cho | Oct 2006 | B2 |
7131571 | Swift et al. | Nov 2006 | B2 |
7139594 | Nagatomo | Nov 2006 | B2 |
7140539 | Crews | Nov 2006 | B1 |
7163347 | Lugg | Jan 2007 | B2 |
7178721 | Maloney | Feb 2007 | B2 |
7181430 | Buchanan et al. | Feb 2007 | B1 |
7184980 | Allen-Rouman et al. | Feb 2007 | B2 |
7185805 | McShirley | Mar 2007 | B1 |
7197173 | Jones et al. | Mar 2007 | B2 |
7200255 | Jones | Apr 2007 | B2 |
7204412 | Foss, Jr. | Apr 2007 | B2 |
7216106 | Buchanan | May 2007 | B1 |
7219082 | Forte | May 2007 | B2 |
7219831 | Murata | May 2007 | B2 |
7245765 | Myers et al. | Jul 2007 | B2 |
7249076 | Pendleton | Jul 2007 | B1 |
7252224 | Verma | Aug 2007 | B2 |
7257246 | Brodie et al. | Aug 2007 | B1 |
7266230 | Doran | Sep 2007 | B2 |
7277191 | Metcalfe et al. | Oct 2007 | B2 |
7290034 | Budd | Oct 2007 | B2 |
7299970 | Ching | Nov 2007 | B1 |
7299979 | Phillips | Nov 2007 | B2 |
7313543 | Crane | Dec 2007 | B1 |
7314163 | Crews et al. | Jan 2008 | B1 |
7321874 | Dilip | Jan 2008 | B2 |
7321875 | Dilip | Jan 2008 | B2 |
7325725 | Foss, Jr. | Feb 2008 | B2 |
7328190 | Smith et al. | Feb 2008 | B2 |
7330604 | Wu et al. | Feb 2008 | B2 |
7331523 | Meier et al. | Feb 2008 | B2 |
7336813 | Prakash et al. | Feb 2008 | B2 |
7343320 | Treyz | Mar 2008 | B1 |
7349566 | Jones et al. | Mar 2008 | B2 |
7349585 | Li | Mar 2008 | B2 |
7356505 | March | Apr 2008 | B2 |
7369713 | Suino | May 2008 | B2 |
7377425 | Ma | May 2008 | B1 |
7379978 | Anderson | May 2008 | B2 |
7383227 | Weinflash et al. | Jun 2008 | B2 |
7385631 | Maeno | Jun 2008 | B2 |
7386511 | Buchanan | Jun 2008 | B2 |
7388683 | Rodriguez et al. | Jun 2008 | B2 |
7391897 | Jones et al. | Jun 2008 | B2 |
7391934 | Goodall et al. | Jun 2008 | B2 |
7392935 | Byrne | Jul 2008 | B2 |
7401048 | Rosedale | Jul 2008 | B2 |
7403917 | Larsen | Jul 2008 | B1 |
7406198 | Aoki et al. | Jul 2008 | B2 |
7419093 | Blackson et al. | Sep 2008 | B1 |
7421107 | Lugg | Sep 2008 | B2 |
7421410 | Schechtman et al. | Sep 2008 | B1 |
7427016 | Chimento | Sep 2008 | B2 |
7433098 | Klein et al. | Oct 2008 | B2 |
7437327 | Lam | Oct 2008 | B2 |
7440924 | Buchanan | Oct 2008 | B2 |
7447347 | Weber | Nov 2008 | B2 |
7455220 | Phillips | Nov 2008 | B2 |
7455221 | Sheaffer | Nov 2008 | B2 |
7460108 | Tamura | Dec 2008 | B2 |
7460700 | Tsunachima et al. | Dec 2008 | B2 |
7461779 | Ramachandran | Dec 2008 | B2 |
7461780 | Potts | Dec 2008 | B2 |
7464859 | Hawkins | Dec 2008 | B1 |
7471818 | Price | Dec 2008 | B1 |
7475040 | Buchanan | Jan 2009 | B2 |
7477923 | Wallmark | Jan 2009 | B2 |
7480382 | Dunbar | Jan 2009 | B2 |
7480422 | Ackley et al. | Jan 2009 | B2 |
7489953 | Griffin | Feb 2009 | B2 |
7490242 | Torres | Feb 2009 | B2 |
7497429 | Reynders | Mar 2009 | B2 |
7503486 | Ahles | Mar 2009 | B2 |
7505759 | Rahman | Mar 2009 | B1 |
7506261 | Statou | Mar 2009 | B2 |
7509287 | Nutahara | Mar 2009 | B2 |
7512564 | Geer | Mar 2009 | B1 |
7519560 | Lam | Apr 2009 | B2 |
7520420 | Phillips | Apr 2009 | B2 |
7520422 | Robinson et al. | Apr 2009 | B1 |
7536354 | deGroeve et al. | May 2009 | B1 |
7536440 | Budd | May 2009 | B2 |
7539646 | Gilder | May 2009 | B2 |
7540408 | Levine | Jun 2009 | B2 |
7542598 | Jones | Jun 2009 | B2 |
7545529 | Borrey et al. | Jun 2009 | B2 |
7548641 | Gilson et al. | Jun 2009 | B2 |
7566002 | Love et al. | Jul 2009 | B2 |
7571848 | Cohen | Aug 2009 | B2 |
7577614 | Warren et al. | Aug 2009 | B1 |
7587066 | Cordery et al. | Sep 2009 | B2 |
7587363 | Cataline | Sep 2009 | B2 |
7590275 | Clarke et al. | Sep 2009 | B2 |
7599543 | Jones | Oct 2009 | B2 |
7599888 | Manfre | Oct 2009 | B2 |
7602956 | Jones | Oct 2009 | B2 |
7606762 | Heit | Oct 2009 | B1 |
7609873 | Foth et al. | Oct 2009 | B2 |
7609889 | Guo et al. | Oct 2009 | B2 |
7619721 | Jones | Nov 2009 | B2 |
7620231 | Jones | Nov 2009 | B2 |
7620604 | Bueche, Jr. | Nov 2009 | B1 |
7630518 | Frew et al. | Dec 2009 | B2 |
7644037 | Ostrovsky | Jan 2010 | B1 |
7644043 | Minowa | Jan 2010 | B2 |
7647275 | Jones | Jan 2010 | B2 |
7668363 | Price | Feb 2010 | B2 |
7672022 | Fan | Mar 2010 | B1 |
7672940 | Viola | Mar 2010 | B2 |
7676409 | Ahmad | Mar 2010 | B1 |
7680732 | Davies et al. | Mar 2010 | B1 |
7680735 | Loy | Mar 2010 | B1 |
7689482 | Lam | Mar 2010 | B2 |
7697776 | Wu et al. | Apr 2010 | B2 |
7698222 | Bueche, Jr. | Apr 2010 | B1 |
7702588 | Gilder et al. | Apr 2010 | B2 |
7714778 | Dupray | May 2010 | B2 |
7720735 | Anderson et al. | May 2010 | B2 |
7734545 | Fogliano | Jun 2010 | B1 |
7743979 | Fredman | Jun 2010 | B2 |
7753268 | Robinson et al. | Jul 2010 | B1 |
7761358 | Craig et al. | Jul 2010 | B2 |
7766244 | Field | Aug 2010 | B1 |
7769650 | Bleunven | Aug 2010 | B2 |
7778457 | Nepomniachtchi et al. | Aug 2010 | B2 |
7792752 | Kay | Sep 2010 | B1 |
7792753 | Slater et al. | Sep 2010 | B1 |
7810714 | Murata | Oct 2010 | B2 |
7812986 | Graham et al. | Oct 2010 | B2 |
7818245 | Prakash et al. | Oct 2010 | B2 |
7831458 | Neumann | Nov 2010 | B2 |
7856402 | Kay | Dec 2010 | B1 |
7865384 | Anderson et al. | Jan 2011 | B2 |
7873200 | Oakes, III et al. | Jan 2011 | B1 |
7876949 | Oakes, III et al. | Jan 2011 | B1 |
7885451 | Walls et al. | Feb 2011 | B1 |
7885880 | Prasad et al. | Feb 2011 | B1 |
7894094 | Nacman et al. | Feb 2011 | B2 |
7895054 | Slen et al. | Feb 2011 | B2 |
7896232 | Prasad et al. | Mar 2011 | B1 |
7900822 | Prasad et al. | Mar 2011 | B1 |
7903863 | Jones et al. | Mar 2011 | B2 |
7904386 | Kalra et al. | Mar 2011 | B2 |
7912785 | Kay | Mar 2011 | B1 |
7935441 | Tononishi | May 2011 | B2 |
7949587 | Morris et al. | May 2011 | B1 |
7950698 | Popadic et al. | May 2011 | B2 |
7953441 | Lors | May 2011 | B2 |
7958053 | Stone | Jun 2011 | B2 |
7962411 | Prasad et al. | Jun 2011 | B1 |
7970677 | Oakes, III et al. | Jun 2011 | B1 |
7974899 | Prasad et al. | Jul 2011 | B1 |
7978900 | Nepomniachtchi et al. | Jul 2011 | B2 |
7979326 | Kurushima | Jul 2011 | B2 |
7996312 | Beck et al. | Aug 2011 | B1 |
7996314 | Smith et al. | Aug 2011 | B1 |
7996315 | Smith et al. | Aug 2011 | B1 |
7996316 | Smith et al. | Aug 2011 | B1 |
8001051 | Smith et al. | Aug 2011 | B1 |
8045784 | Price et al. | Oct 2011 | B2 |
8046301 | Smith et al. | Oct 2011 | B1 |
8060442 | Hecht et al. | Nov 2011 | B1 |
8065307 | Haslam et al. | Nov 2011 | B2 |
8091778 | Block et al. | Jan 2012 | B1 |
8116533 | Kiplinger et al. | Feb 2012 | B2 |
8159520 | Dhanoa | Apr 2012 | B1 |
8203640 | Kim et al. | Jun 2012 | B2 |
8204293 | Csulits et al. | Jun 2012 | B2 |
8235284 | Prasad et al. | Aug 2012 | B1 |
8266076 | Lopez et al. | Sep 2012 | B2 |
8271385 | Emerson et al. | Sep 2012 | B2 |
8290237 | Burks et al. | Oct 2012 | B1 |
8320657 | Burks et al. | Nov 2012 | B1 |
8332329 | Thiele | Dec 2012 | B1 |
8351677 | Oakes, III et al. | Jan 2013 | B1 |
8351678 | Medina, III | Jan 2013 | B1 |
8358826 | Medina et al. | Jan 2013 | B1 |
8364563 | Choiniere, Sr. | Jan 2013 | B2 |
8369650 | Zanfir et al. | Feb 2013 | B2 |
8374963 | Billman | Feb 2013 | B1 |
8391599 | Medina, III | Mar 2013 | B1 |
8392332 | Oakes, III et al. | Mar 2013 | B1 |
8401962 | Bent et al. | Mar 2013 | B1 |
8422758 | Bueche, Jr. | Apr 2013 | B1 |
8433127 | Harpel et al. | Apr 2013 | B1 |
8433647 | Yarbrough | Apr 2013 | B1 |
8452689 | Medina, III | May 2013 | B1 |
8464933 | Prasad et al. | Jun 2013 | B1 |
8538124 | Harpel et al. | Sep 2013 | B1 |
8542921 | Medina | Sep 2013 | B1 |
8548267 | Yacoub et al. | Oct 2013 | B1 |
8559766 | Tilt et al. | Oct 2013 | B2 |
8582862 | Nepomniachtchi et al. | Nov 2013 | B2 |
8611635 | Medina, III | Dec 2013 | B1 |
8660952 | Viera et al. | Feb 2014 | B1 |
8699779 | Prasad et al. | Apr 2014 | B1 |
8708227 | Oakes, III et al. | Apr 2014 | B1 |
8731321 | Fujiwara et al. | May 2014 | B2 |
8732081 | Oakes, III et al. | May 2014 | B1 |
8751345 | Borzyche et al. | Jun 2014 | B1 |
8751356 | Garcia | Jun 2014 | B1 |
8751379 | Bueche, Jr. | Jun 2014 | B1 |
8799147 | Walls et al. | Aug 2014 | B1 |
8837806 | Ethington et al. | Sep 2014 | B1 |
8843405 | Hartman et al. | Sep 2014 | B1 |
8959033 | Oakes, III et al. | Feb 2015 | B1 |
8977571 | Bueche, Jr. et al. | Mar 2015 | B1 |
8990862 | Smith | Mar 2015 | B1 |
9009071 | Watson et al. | Apr 2015 | B1 |
9036040 | Danko | May 2015 | B1 |
9058512 | Medina, III | Jun 2015 | B1 |
9064284 | Janiszeski et al. | Jun 2015 | B1 |
9129340 | Medina, III et al. | Aug 2015 | B1 |
9159101 | Pollack et al. | Oct 2015 | B1 |
9177197 | Prasad et al. | Nov 2015 | B1 |
9177198 | Prasad et al. | Nov 2015 | B1 |
9224136 | Oakes, III et al. | Dec 2015 | B1 |
9286514 | Newman | Mar 2016 | B1 |
9311634 | Hildebrand | Apr 2016 | B1 |
9336517 | Prasad et al. | May 2016 | B1 |
9390339 | Danko | Jul 2016 | B1 |
9401011 | Medina, III et al. | Jul 2016 | B2 |
9424569 | Sherman et al. | Aug 2016 | B1 |
9569756 | Bueche, Jr. et al. | Feb 2017 | B1 |
9619872 | Medina, III et al. | Apr 2017 | B1 |
9626183 | Smith et al. | Apr 2017 | B1 |
9626662 | Prasad et al. | Apr 2017 | B1 |
9779392 | Prasad et al. | Oct 2017 | B1 |
9779452 | Medina et al. | Oct 2017 | B1 |
9785929 | Watson et al. | Oct 2017 | B1 |
9792654 | Limas et al. | Oct 2017 | B1 |
9818090 | Bueche, Jr. et al. | Nov 2017 | B1 |
9886642 | Danko | Feb 2018 | B1 |
9892454 | Pollack et al. | Feb 2018 | B1 |
9898778 | Pollack et al. | Feb 2018 | B1 |
9898808 | Medina, III et al. | Feb 2018 | B1 |
9904848 | Newman | Feb 2018 | B1 |
9946923 | Medina | Apr 2018 | B1 |
10013605 | Oakes, III et al. | Jul 2018 | B1 |
10013681 | Oakes, III et al. | Jul 2018 | B1 |
10181087 | Danko | Jan 2019 | B1 |
10235660 | Bueche, Jr. et al. | Mar 2019 | B1 |
20010004235 | Maloney | Jun 2001 | A1 |
20010014881 | Drummond | Aug 2001 | A1 |
20010016084 | Pollard et al. | Aug 2001 | A1 |
20010018739 | Anderson | Aug 2001 | A1 |
20010027994 | Hayashida | Oct 2001 | A1 |
20010037299 | Nichols et al. | Nov 2001 | A1 |
20010042171 | Vermeulen | Nov 2001 | A1 |
20010042785 | Walker | Nov 2001 | A1 |
20010043748 | Wesolkowski et al. | Nov 2001 | A1 |
20010047330 | Gephart | Nov 2001 | A1 |
20010054020 | Barth et al. | Dec 2001 | A1 |
20020001393 | Jones | Jan 2002 | A1 |
20020013767 | Katz | Jan 2002 | A1 |
20020016763 | March | Feb 2002 | A1 |
20020016769 | Barbara et al. | Feb 2002 | A1 |
20020023055 | Antognini et al. | Feb 2002 | A1 |
20020025085 | Gustafson et al. | Feb 2002 | A1 |
20020026418 | Koppel et al. | Feb 2002 | A1 |
20020032656 | Chen | Mar 2002 | A1 |
20020038289 | Lawlor et al. | Mar 2002 | A1 |
20020040340 | Yoshida | Apr 2002 | A1 |
20020052841 | Guthrie | May 2002 | A1 |
20020052853 | Munoz | May 2002 | A1 |
20020065786 | Martens et al. | May 2002 | A1 |
20020072974 | Pugliese | Jun 2002 | A1 |
20020075524 | Blair | Jun 2002 | A1 |
20020084321 | Martens | Jul 2002 | A1 |
20020087467 | Mascavage, III et al. | Jul 2002 | A1 |
20020107767 | McClair et al. | Aug 2002 | A1 |
20020107809 | Biddle et al. | Aug 2002 | A1 |
20020116329 | Serbetcioglu | Aug 2002 | A1 |
20020116335 | Star | Aug 2002 | A1 |
20020118891 | Rudd | Aug 2002 | A1 |
20020120562 | Opiela | Aug 2002 | A1 |
20020120582 | Elston et al. | Aug 2002 | A1 |
20020120846 | Stewart et al. | Aug 2002 | A1 |
20020129249 | Maillard et al. | Sep 2002 | A1 |
20020130868 | Smith | Sep 2002 | A1 |
20020133409 | Sawano et al. | Sep 2002 | A1 |
20020138445 | Laage et al. | Sep 2002 | A1 |
20020138522 | Muralidhar | Sep 2002 | A1 |
20020147798 | Huang | Oct 2002 | A1 |
20020150279 | Scott | Oct 2002 | A1 |
20020150311 | Lynn | Oct 2002 | A1 |
20020152160 | Allen-Rouman et al. | Oct 2002 | A1 |
20020152161 | Aoike | Oct 2002 | A1 |
20020152164 | Dutta | Oct 2002 | A1 |
20020152165 | Dutta et al. | Oct 2002 | A1 |
20020152169 | Dutta | Oct 2002 | A1 |
20020152170 | Dutta | Oct 2002 | A1 |
20020153414 | Stoutenburg et al. | Oct 2002 | A1 |
20020154127 | Vienneau et al. | Oct 2002 | A1 |
20020159648 | Alderson et al. | Oct 2002 | A1 |
20020169715 | Ruth et al. | Nov 2002 | A1 |
20020171820 | Okamura | Nov 2002 | A1 |
20020178112 | Goeller | Nov 2002 | A1 |
20020186881 | Li | Dec 2002 | A1 |
20020188564 | Star | Dec 2002 | A1 |
20020195485 | Pomerleau et al. | Dec 2002 | A1 |
20030005326 | Flemming | Jan 2003 | A1 |
20030009420 | Jones | Jan 2003 | A1 |
20030015583 | Abdi et al. | Jan 2003 | A1 |
20030018897 | Bellis, Jr. et al. | Jan 2003 | A1 |
20030023557 | Moore | Jan 2003 | A1 |
20030026609 | Parulski | Feb 2003 | A1 |
20030038227 | Sesek | Feb 2003 | A1 |
20030050889 | Burke | Mar 2003 | A1 |
20030053692 | Hong et al. | Mar 2003 | A1 |
20030055756 | Allan | Mar 2003 | A1 |
20030055776 | Samuelson | Mar 2003 | A1 |
20030072568 | Lin et al. | Apr 2003 | A1 |
20030074315 | Lam | Apr 2003 | A1 |
20030075596 | Koakutsu | Apr 2003 | A1 |
20030075916 | Gorski | Apr 2003 | A1 |
20030078883 | Stewart et al. | Apr 2003 | A1 |
20030081824 | Mennie | May 2003 | A1 |
20030086615 | Dance et al. | May 2003 | A1 |
20030093367 | Allen-Rouman et al. | May 2003 | A1 |
20030093369 | Ijichi et al. | May 2003 | A1 |
20030102714 | Rhodes et al. | Jun 2003 | A1 |
20030105688 | Brown et al. | Jun 2003 | A1 |
20030105714 | Alarcon-Luther et al. | Jun 2003 | A1 |
20030126078 | Vihinen | Jul 2003 | A1 |
20030126082 | Omura et al. | Jul 2003 | A1 |
20030130940 | Hansen et al. | Jul 2003 | A1 |
20030132384 | Sugiyama et al. | Jul 2003 | A1 |
20030133608 | Bernstein et al. | Jul 2003 | A1 |
20030133610 | Nagarajan et al. | Jul 2003 | A1 |
20030135457 | Stewart et al. | Jul 2003 | A1 |
20030139999 | Rowe | Jul 2003 | A1 |
20030159046 | Choi et al. | Aug 2003 | A1 |
20030167225 | Adams | Sep 2003 | A1 |
20030187790 | Swift et al. | Oct 2003 | A1 |
20030191615 | Bailey | Oct 2003 | A1 |
20030191869 | Williams | Oct 2003 | A1 |
20030200107 | Allen et al. | Oct 2003 | A1 |
20030200174 | Star | Oct 2003 | A1 |
20030202690 | Jones et al. | Oct 2003 | A1 |
20030212904 | Randle et al. | Nov 2003 | A1 |
20030217005 | Drummond et al. | Nov 2003 | A1 |
20030218061 | Filatov | Nov 2003 | A1 |
20030225705 | Park et al. | Dec 2003 | A1 |
20030231285 | Ferguson | Dec 2003 | A1 |
20030233278 | Marshall | Dec 2003 | A1 |
20030233318 | King et al. | Dec 2003 | A1 |
20040010466 | Anderson | Jan 2004 | A1 |
20040012496 | De Souza | Jan 2004 | A1 |
20040013284 | Yu | Jan 2004 | A1 |
20040017482 | Weitman | Jan 2004 | A1 |
20040024626 | Bruijning | Feb 2004 | A1 |
20040024708 | Masuda | Feb 2004 | A1 |
20040029591 | Chapman et al. | Feb 2004 | A1 |
20040030741 | Wolton et al. | Feb 2004 | A1 |
20040044606 | Buttridge et al. | Mar 2004 | A1 |
20040057697 | Renzi | Mar 2004 | A1 |
20040058705 | Morgan | Mar 2004 | A1 |
20040066031 | Wong | Apr 2004 | A1 |
20040069841 | Wong | Apr 2004 | A1 |
20040071333 | Douglas et al. | Apr 2004 | A1 |
20040075754 | Nakajima et al. | Apr 2004 | A1 |
20040076320 | Downs, Jr. | Apr 2004 | A1 |
20040078299 | Down-Logan | Apr 2004 | A1 |
20040080795 | Bean et al. | Apr 2004 | A1 |
20040089711 | Sandru | May 2004 | A1 |
20040093303 | Picciallo | May 2004 | A1 |
20040093305 | Kight | May 2004 | A1 |
20040103057 | Melbert et al. | May 2004 | A1 |
20040103296 | Harp | May 2004 | A1 |
20040109596 | Doran | Jun 2004 | A1 |
20040110975 | Osinski et al. | Jun 2004 | A1 |
20040111371 | Friedman | Jun 2004 | A1 |
20040117302 | Weichert | Jun 2004 | A1 |
20040122754 | Stevens | Jun 2004 | A1 |
20040133511 | Smith et al. | Jul 2004 | A1 |
20040138974 | Shimamura | Jul 2004 | A1 |
20040148235 | Craig et al. | Jul 2004 | A1 |
20040158549 | Matena | Aug 2004 | A1 |
20040165096 | Maeno | Aug 2004 | A1 |
20040170259 | Park | Sep 2004 | A1 |
20040184766 | Kim et al. | Sep 2004 | A1 |
20040201741 | Ban | Oct 2004 | A1 |
20040210515 | Hughes | Oct 2004 | A1 |
20040210523 | Gains et al. | Oct 2004 | A1 |
20040225604 | Foss, Jr. et al. | Nov 2004 | A1 |
20040228277 | Williams | Nov 2004 | A1 |
20040236647 | Acharya | Nov 2004 | A1 |
20040236688 | Bozeman | Nov 2004 | A1 |
20040240722 | Tsuji et al. | Dec 2004 | A1 |
20040245324 | Chen | Dec 2004 | A1 |
20040247199 | Murai et al. | Dec 2004 | A1 |
20040248600 | Kim | Dec 2004 | A1 |
20040252679 | Williams | Dec 2004 | A1 |
20040260636 | Marceau | Dec 2004 | A1 |
20040267666 | Minami | Dec 2004 | A1 |
20050001421 | Luth et al. | Jan 2005 | A1 |
20050010108 | Rahn et al. | Jan 2005 | A1 |
20050015332 | Chen | Jan 2005 | A1 |
20050015342 | Murata et al. | Jan 2005 | A1 |
20050021466 | Buchanan et al. | Jan 2005 | A1 |
20050030388 | Stavely et al. | Feb 2005 | A1 |
20050033645 | Duphily | Feb 2005 | A1 |
20050033685 | Reyes | Feb 2005 | A1 |
20050033690 | Antognini et al. | Feb 2005 | A1 |
20050033695 | Minowa | Feb 2005 | A1 |
20050035193 | Gustin et al. | Feb 2005 | A1 |
20050038746 | Latimer et al. | Feb 2005 | A1 |
20050038754 | Geist | Feb 2005 | A1 |
20050044042 | Mendiola | Feb 2005 | A1 |
20050044577 | Jerding | Feb 2005 | A1 |
20050049950 | Johnson | Mar 2005 | A1 |
20050071283 | Randle et al. | Mar 2005 | A1 |
20050075969 | Nielson et al. | Apr 2005 | A1 |
20050075974 | Turgeon | Apr 2005 | A1 |
20050077351 | De Jong | Apr 2005 | A1 |
20050078336 | Ferlitsch | Apr 2005 | A1 |
20050080725 | Pick | Apr 2005 | A1 |
20050082364 | Alvarez et al. | Apr 2005 | A1 |
20050086140 | Ireland | Apr 2005 | A1 |
20050086168 | Alvarez | Apr 2005 | A1 |
20050089209 | Stefanuk | Apr 2005 | A1 |
20050091161 | Gustin | Apr 2005 | A1 |
20050096992 | Geisel | May 2005 | A1 |
20050097019 | Jacobs | May 2005 | A1 |
20050097046 | Singfield | May 2005 | A1 |
20050097050 | Orcutt | May 2005 | A1 |
20050108164 | Salafia | May 2005 | A1 |
20050108168 | Halpin | May 2005 | A1 |
20050115110 | Dinkins | Jun 2005 | A1 |
20050125338 | Tidwell et al. | Jun 2005 | A1 |
20050125360 | Tidwell et al. | Jun 2005 | A1 |
20050127160 | Fujikawa | Jun 2005 | A1 |
20050131820 | Rodriguez | Jun 2005 | A1 |
20050143136 | Lev et al. | Jun 2005 | A1 |
20050149436 | Elterich | Jul 2005 | A1 |
20050168566 | Tada | Aug 2005 | A1 |
20050171899 | Dunn | Aug 2005 | A1 |
20050171907 | Lewis | Aug 2005 | A1 |
20050177494 | Kelly et al. | Aug 2005 | A1 |
20050177499 | Thomas | Aug 2005 | A1 |
20050177510 | Hilt et al. | Aug 2005 | A1 |
20050177518 | Brown | Aug 2005 | A1 |
20050182710 | Anderson | Aug 2005 | A1 |
20050188306 | Mackenzie | Aug 2005 | A1 |
20050203430 | Williams et al. | Sep 2005 | A1 |
20050205661 | Taylor | Sep 2005 | A1 |
20050209961 | Michelsen | Sep 2005 | A1 |
20050213805 | Blake et al. | Sep 2005 | A1 |
20050216410 | Davis et al. | Sep 2005 | A1 |
20050218209 | Heilper et al. | Oct 2005 | A1 |
20050220324 | Klein et al. | Oct 2005 | A1 |
20050228733 | Bent et al. | Oct 2005 | A1 |
20050244035 | Klein et al. | Nov 2005 | A1 |
20050252955 | Sugai | Nov 2005 | A1 |
20050267843 | Acharya et al. | Dec 2005 | A1 |
20050268107 | Harris et al. | Dec 2005 | A1 |
20050269412 | Chiu | Dec 2005 | A1 |
20050273368 | Hutten et al. | Dec 2005 | A1 |
20050278250 | Zair | Dec 2005 | A1 |
20050281448 | Lugg | Dec 2005 | A1 |
20050281471 | LeConte | Dec 2005 | A1 |
20050281474 | Huang | Dec 2005 | A1 |
20050289030 | Smith | Dec 2005 | A1 |
20050289059 | Brewington et al. | Dec 2005 | A1 |
20050289182 | Pandian et al. | Dec 2005 | A1 |
20060002426 | Madour | Jan 2006 | A1 |
20060004660 | Pranger | Jan 2006 | A1 |
20060015450 | Guck et al. | Jan 2006 | A1 |
20060015733 | O'Malley et al. | Jan 2006 | A1 |
20060017752 | Kurzweil et al. | Jan 2006 | A1 |
20060025697 | Kurzweil | Feb 2006 | A1 |
20060039628 | Li et al. | Feb 2006 | A1 |
20060039629 | Li et al. | Feb 2006 | A1 |
20060041506 | Mason et al. | Feb 2006 | A1 |
20060045321 | Yu | Mar 2006 | A1 |
20060045374 | Kim et al. | Mar 2006 | A1 |
20060045379 | Heaney, Jr. et al. | Mar 2006 | A1 |
20060047593 | Naratil | Mar 2006 | A1 |
20060049242 | Mejias et al. | Mar 2006 | A1 |
20060053056 | Alspach-Goss | Mar 2006 | A1 |
20060059085 | Tucker | Mar 2006 | A1 |
20060064368 | Forte | Mar 2006 | A1 |
20060080245 | Bahl | Apr 2006 | A1 |
20060085357 | Pizarro | Apr 2006 | A1 |
20060085516 | Farr et al. | Apr 2006 | A1 |
20060102704 | Reynders | May 2006 | A1 |
20060103893 | Azimi et al. | May 2006 | A1 |
20060106691 | Sheaffer | May 2006 | A1 |
20060106717 | Randle | May 2006 | A1 |
20060108168 | Fischer et al. | May 2006 | A1 |
20060110063 | Weiss | May 2006 | A1 |
20060112013 | Maloney | May 2006 | A1 |
20060115110 | Rodriguez | Jun 2006 | A1 |
20060115141 | Koakutsu et al. | Jun 2006 | A1 |
20060118613 | McMann | Jun 2006 | A1 |
20060124730 | Maloney | Jun 2006 | A1 |
20060144924 | Stover | Jul 2006 | A1 |
20060144937 | Heilper et al. | Jul 2006 | A1 |
20060144950 | Johnson | Jul 2006 | A1 |
20060159367 | Zeineh et al. | Jul 2006 | A1 |
20060161499 | Rich et al. | Jul 2006 | A1 |
20060161501 | Waserstein | Jul 2006 | A1 |
20060164682 | Lev | Jul 2006 | A1 |
20060166178 | Driedijk | Jul 2006 | A1 |
20060167818 | Wentker et al. | Jul 2006 | A1 |
20060181614 | Yen et al. | Aug 2006 | A1 |
20060182331 | Gilson et al. | Aug 2006 | A1 |
20060182332 | Weber | Aug 2006 | A1 |
20060186194 | Richardson et al. | Aug 2006 | A1 |
20060202014 | VanKirk et al. | Sep 2006 | A1 |
20060206506 | Fitzpatrick | Sep 2006 | A1 |
20060208059 | Cable et al. | Sep 2006 | A1 |
20060210138 | Hilton et al. | Sep 2006 | A1 |
20060212391 | Norman et al. | Sep 2006 | A1 |
20060212393 | Brown | Sep 2006 | A1 |
20060214940 | Kinoshita | Sep 2006 | A1 |
20060215204 | Miyamoto et al. | Sep 2006 | A1 |
20060215230 | Borrey et al. | Sep 2006 | A1 |
20060222260 | Sambongi et al. | Oct 2006 | A1 |
20060229976 | Jung | Oct 2006 | A1 |
20060229986 | Corder | Oct 2006 | A1 |
20060238503 | Smith | Oct 2006 | A1 |
20060242062 | Peterson | Oct 2006 | A1 |
20060242063 | Peterson | Oct 2006 | A1 |
20060248009 | Hicks et al. | Nov 2006 | A1 |
20060249567 | Byrne | Nov 2006 | A1 |
20060274164 | Kimura et al. | Dec 2006 | A1 |
20060279628 | Fleming | Dec 2006 | A1 |
20060282383 | Doran | Dec 2006 | A1 |
20060291744 | Ikeda et al. | Dec 2006 | A1 |
20070002157 | Shintani et al. | Jan 2007 | A1 |
20070013721 | Vau et al. | Jan 2007 | A1 |
20070016796 | Singhal | Jan 2007 | A1 |
20070019243 | Sato | Jan 2007 | A1 |
20070022053 | Waserstein | Jan 2007 | A1 |
20070027802 | VanDeburg et al. | Feb 2007 | A1 |
20070030357 | Levien et al. | Feb 2007 | A1 |
20070031022 | Frew | Feb 2007 | A1 |
20070038561 | Vancini et al. | Feb 2007 | A1 |
20070041629 | Prakash et al. | Feb 2007 | A1 |
20070050292 | Yarbrough | Mar 2007 | A1 |
20070053574 | Verma et al. | Mar 2007 | A1 |
20070058851 | Quine | Mar 2007 | A1 |
20070063016 | Myatt | Mar 2007 | A1 |
20070064991 | Douglas et al. | Mar 2007 | A1 |
20070065143 | Didow et al. | Mar 2007 | A1 |
20070075772 | Kokubo | Apr 2007 | A1 |
20070076940 | Goodall et al. | Apr 2007 | A1 |
20070076941 | Carreon et al. | Apr 2007 | A1 |
20070077921 | Hayashi | Apr 2007 | A1 |
20070080207 | Williams | Apr 2007 | A1 |
20070082700 | Landschaft | Apr 2007 | A1 |
20070084911 | Crowell | Apr 2007 | A1 |
20070086642 | Foth | Apr 2007 | A1 |
20070086643 | Spier | Apr 2007 | A1 |
20070094088 | Mastie | Apr 2007 | A1 |
20070094140 | Riney et al. | Apr 2007 | A1 |
20070100748 | Dheer | May 2007 | A1 |
20070110277 | Hayduchok et al. | May 2007 | A1 |
20070118472 | Allen-Rouman et al. | May 2007 | A1 |
20070122024 | Haas et al. | May 2007 | A1 |
20070124241 | Newton | May 2007 | A1 |
20070127805 | Foth et al. | Jun 2007 | A1 |
20070129955 | Dalmia | Jun 2007 | A1 |
20070131758 | Mejias et al. | Jun 2007 | A1 |
20070136198 | Foth et al. | Jun 2007 | A1 |
20070138255 | Carreon et al. | Jun 2007 | A1 |
20070140545 | Rossignoli | Jun 2007 | A1 |
20070140594 | Franklin | Jun 2007 | A1 |
20070143208 | Varga | Jun 2007 | A1 |
20070150337 | Hawkins et al. | Jun 2007 | A1 |
20070154098 | Geva et al. | Jul 2007 | A1 |
20070156438 | Popadic et al. | Jul 2007 | A1 |
20070168265 | Rosenberger | Jul 2007 | A1 |
20070168283 | Alvarez et al. | Jul 2007 | A1 |
20070171288 | Inoue | Jul 2007 | A1 |
20070172107 | Jones et al. | Jul 2007 | A1 |
20070172148 | Hawley | Jul 2007 | A1 |
20070175977 | Bauer et al. | Aug 2007 | A1 |
20070179883 | Questembert | Aug 2007 | A1 |
20070183000 | Eisen et al. | Aug 2007 | A1 |
20070183741 | Lerman et al. | Aug 2007 | A1 |
20070194102 | Cohen | Aug 2007 | A1 |
20070198432 | Pitroda et al. | Aug 2007 | A1 |
20070203708 | Polycn et al. | Aug 2007 | A1 |
20070206877 | Wu et al. | Sep 2007 | A1 |
20070208816 | Baldwin et al. | Sep 2007 | A1 |
20070214086 | Homoki | Sep 2007 | A1 |
20070217669 | Swift et al. | Sep 2007 | A1 |
20070233525 | Boyle | Oct 2007 | A1 |
20070233585 | Ben Simon et al. | Oct 2007 | A1 |
20070235518 | Mueller et al. | Oct 2007 | A1 |
20070235520 | Smith et al. | Oct 2007 | A1 |
20070241179 | Davis | Oct 2007 | A1 |
20070244782 | Chimento | Oct 2007 | A1 |
20070246525 | Smith et al. | Oct 2007 | A1 |
20070251992 | Sharma et al. | Nov 2007 | A1 |
20070255652 | Tumminaro | Nov 2007 | A1 |
20070255653 | Tumminaro | Nov 2007 | A1 |
20070255662 | Tumminaro | Nov 2007 | A1 |
20070258634 | Simonoff | Nov 2007 | A1 |
20070262137 | Brown | Nov 2007 | A1 |
20070262148 | Yoon et al. | Nov 2007 | A1 |
20070268540 | Gaspardo et al. | Nov 2007 | A1 |
20070271182 | Prakash et al. | Nov 2007 | A1 |
20070278286 | Crowell et al. | Dec 2007 | A1 |
20070288380 | Starrs | Dec 2007 | A1 |
20070288382 | Narayanan et al. | Dec 2007 | A1 |
20070295803 | Levine et al. | Dec 2007 | A1 |
20070299928 | Kohli et al. | Dec 2007 | A1 |
20080002911 | Eisen | Jan 2008 | A1 |
20080010204 | Rackley, III et al. | Jan 2008 | A1 |
20080021802 | Pendelton | Jan 2008 | A1 |
20080040280 | Davis et al. | Feb 2008 | A1 |
20080046362 | Easterly | Feb 2008 | A1 |
20080052182 | Marshall | Feb 2008 | A1 |
20080059376 | Davis | Mar 2008 | A1 |
20080063253 | Wood | Mar 2008 | A1 |
20080065524 | Matthews et al. | Mar 2008 | A1 |
20080068674 | McIntyre | Mar 2008 | A1 |
20080071679 | Foley | Mar 2008 | A1 |
20080071721 | Wang | Mar 2008 | A1 |
20080073423 | Heit et al. | Mar 2008 | A1 |
20080080760 | Ronca | Apr 2008 | A1 |
20080086420 | Gilder et al. | Apr 2008 | A1 |
20080086421 | Gilder | Apr 2008 | A1 |
20080086770 | Kulkarni et al. | Apr 2008 | A1 |
20080091599 | Foss, Jr. | Apr 2008 | A1 |
20080097899 | Jackson et al. | Apr 2008 | A1 |
20080097907 | Till et al. | Apr 2008 | A1 |
20080103790 | Abernethy | May 2008 | A1 |
20080103967 | Ackert et al. | May 2008 | A1 |
20080113674 | Baig | May 2008 | A1 |
20080114739 | Hayes | May 2008 | A1 |
20080115066 | Pavley et al. | May 2008 | A1 |
20080116257 | Fickling | May 2008 | A1 |
20080117991 | Peddireddy | May 2008 | A1 |
20080119178 | Peddireddy | May 2008 | A1 |
20080133411 | Jones et al. | Jun 2008 | A1 |
20080140579 | Sanjiv | Jun 2008 | A1 |
20080147549 | Ruthbun | Jun 2008 | A1 |
20080155672 | Sharma | Jun 2008 | A1 |
20080156438 | Stumphauzer et al. | Jul 2008 | A1 |
20080162319 | Breeden et al. | Jul 2008 | A1 |
20080162350 | Allen-Rouman et al. | Jul 2008 | A1 |
20080162371 | Rampell et al. | Jul 2008 | A1 |
20080177659 | Lacey et al. | Jul 2008 | A1 |
20080180750 | Feldman | Jul 2008 | A1 |
20080205751 | Mischler | Aug 2008 | A1 |
20080208727 | McLaughlin et al. | Aug 2008 | A1 |
20080214180 | Cunningham et al. | Sep 2008 | A1 |
20080219543 | Csulits | Sep 2008 | A1 |
20080245869 | Berkun et al. | Oct 2008 | A1 |
20080247629 | Gilder | Oct 2008 | A1 |
20080247655 | Yano | Oct 2008 | A1 |
20080249931 | Gilder et al. | Oct 2008 | A1 |
20080249951 | Gilder et al. | Oct 2008 | A1 |
20080262950 | Christensen et al. | Oct 2008 | A1 |
20080262953 | Anderson | Oct 2008 | A1 |
20080275821 | Bishop et al. | Nov 2008 | A1 |
20080301441 | Caiman et al. | Dec 2008 | A1 |
20080304769 | Hollander et al. | Dec 2008 | A1 |
20080316542 | Mindrum et al. | Dec 2008 | A1 |
20090024520 | Drory et al. | Jan 2009 | A1 |
20090046938 | Yoder | Feb 2009 | A1 |
20090060396 | Blessan et al. | Mar 2009 | A1 |
20090066987 | Inokuchi | Mar 2009 | A1 |
20090076921 | Nelson et al. | Mar 2009 | A1 |
20090094148 | Gilder et al. | Apr 2009 | A1 |
20090108080 | Meyer | Apr 2009 | A1 |
20090110281 | Hirabayashi | Apr 2009 | A1 |
20090114716 | Ramachandran | May 2009 | A1 |
20090141962 | Borgia et al. | Jun 2009 | A1 |
20090164350 | Sorbe et al. | Jun 2009 | A1 |
20090164370 | Sorbe et al. | Jun 2009 | A1 |
20090166406 | Pigg et al. | Jul 2009 | A1 |
20090167870 | Caleca et al. | Jul 2009 | A1 |
20090171795 | Clouthier et al. | Jul 2009 | A1 |
20090171819 | Emde et al. | Jul 2009 | A1 |
20090171825 | Roman | Jul 2009 | A1 |
20090173781 | Ramachandran | Jul 2009 | A1 |
20090185241 | Nepomniachtchi | Jul 2009 | A1 |
20090185737 | Nepomniachtchi | Jul 2009 | A1 |
20090185738 | Nepomniachtchi | Jul 2009 | A1 |
20090190823 | Walters | Jul 2009 | A1 |
20090192938 | Amos | Jul 2009 | A1 |
20090212929 | Drory et al. | Aug 2009 | A1 |
20090236413 | Mueller et al. | Sep 2009 | A1 |
20090240620 | Kendrick et al. | Sep 2009 | A1 |
20090252437 | Li | Oct 2009 | A1 |
20090254447 | Blades | Oct 2009 | A1 |
20090257641 | Liu et al. | Oct 2009 | A1 |
20090263019 | Tzadok et al. | Oct 2009 | A1 |
20090271287 | Halpern | Oct 2009 | A1 |
20090281904 | Pharris | Nov 2009 | A1 |
20090284637 | Parulski et al. | Nov 2009 | A1 |
20090290751 | Ferman et al. | Nov 2009 | A1 |
20090292628 | Dryer et al. | Nov 2009 | A1 |
20090313167 | Dujari et al. | Dec 2009 | A1 |
20090319425 | Tumminaro et al. | Dec 2009 | A1 |
20090327129 | Collas et al. | Dec 2009 | A1 |
20100007899 | Lay | Jan 2010 | A1 |
20100027679 | Sunahara et al. | Feb 2010 | A1 |
20100030687 | Panthaki et al. | Feb 2010 | A1 |
20100047000 | Park et al. | Feb 2010 | A1 |
20100057578 | Blair et al. | Mar 2010 | A1 |
20100061446 | Hands et al. | Mar 2010 | A1 |
20100078471 | Lin et al. | Apr 2010 | A1 |
20100082468 | Low et al. | Apr 2010 | A1 |
20100082470 | Walach | Apr 2010 | A1 |
20100165015 | Barkley et al. | Jul 2010 | A1 |
20100198733 | Gantman et al. | Aug 2010 | A1 |
20100225773 | Lee | Sep 2010 | A1 |
20100226559 | Najari et al. | Sep 2010 | A1 |
20100260408 | Prakash et al. | Oct 2010 | A1 |
20100262522 | Anderson et al. | Oct 2010 | A1 |
20100274693 | Bause et al. | Oct 2010 | A1 |
20100312705 | Caruso et al. | Dec 2010 | A1 |
20110016084 | Mundy et al. | Jan 2011 | A1 |
20110069180 | Nijemcevic et al. | Mar 2011 | A1 |
20110106675 | Perlman | May 2011 | A1 |
20110112967 | Anderson et al. | May 2011 | A1 |
20110170740 | Coleman | Jul 2011 | A1 |
20110191161 | Dai | Aug 2011 | A1 |
20110251956 | Cantley et al. | Oct 2011 | A1 |
20110280450 | Nepomniachtchi et al. | Nov 2011 | A1 |
20110285874 | Showering et al. | Nov 2011 | A1 |
20110310442 | Popadic et al. | Dec 2011 | A1 |
20120045112 | Lundblad et al. | Feb 2012 | A1 |
20120047070 | Pharris | Feb 2012 | A1 |
20120062732 | Marman et al. | Mar 2012 | A1 |
20120089514 | Kraemling et al. | Apr 2012 | A1 |
20120099792 | Chevion et al. | Apr 2012 | A1 |
20120185383 | Atsmon | Jul 2012 | A1 |
20120185388 | Pranger | Jul 2012 | A1 |
20120229872 | Dolev | Sep 2012 | A1 |
20130021651 | Popadic et al. | Jan 2013 | A9 |
20130120595 | Roach et al. | May 2013 | A1 |
20130198071 | Jurss | Aug 2013 | A1 |
20130223721 | Nepomniachtchi et al. | Aug 2013 | A1 |
20130297353 | Strange | Nov 2013 | A1 |
20140032406 | Roach | Jan 2014 | A1 |
20140067661 | Elischer | Mar 2014 | A1 |
20140236820 | Carlton et al. | Aug 2014 | A1 |
20140279453 | Belchee | Sep 2014 | A1 |
20150039528 | Minogue et al. | Feb 2015 | A1 |
20150090782 | Dent | Apr 2015 | A1 |
Number | Date | Country |
---|---|---|
0 984 410 | Mar 2000 | EP |
20040076131 | Aug 2004 | KR |
WO 9837655 | Aug 1998 | WO |
WO 0161436 | Aug 2001 | WO |
WO 2006075967 | Jul 2006 | WO |
WO 2006086768 | Aug 2006 | WO |
WO 2006136958 | Dec 2006 | WO |
Entry |
---|
Craig Vaream, Image Deposit Solutions, Nov. 2005, JP Morgan Chase, web, 1-13 (Year: 2005). |
“Accept “Customer Not Present” Checks,” Accept Check Online, http://checksoftware.com, Cited in U.S. Pat. No. 7,900,822, as dated 2007 (1 pg). |
“Adjusting Brightness and Contrast”, www.eaglesoftware.com/adjustin.htm, retrieved on May 4, 2009 (4 pgs). |
“Best practices for producing quality digital image files,” Digital Images Guidelines, http://deepblue.lib.umich.edu/bitstream/2027.42/40247/1/Images-Best_Practice.pdf, downloaded 2007 (2 pgs). |
“Chapter 7 Payroll Programs,” Uniform Staff Payroll System, http://www2.oecn.k12.oh.us/www/ssdt/usps/usps_user_guide_005.html, Cited in U.S. Pat. No. 7,900,822, as dated 2007 (9 pgs). |
“Check 21—The check is not in the post”, RedTitan Technology 2004 http://www.redtitan.com/check21/htm (3 pgs). |
“Check 21 Solutions,” Columbia Financial International, Inc. http://www.columbiafinancial.us/check21/solutions.htm, Cited in U.S. Pat. No. 7,900,822, as dated 2007 (8 pgs). |
“Check Fraud: A Guide to Avoiding Losses”, All Net, http://all.net/books/audit/checkfraud/security.htm, Cited in U.S. Pat. No. 7,900,822, as dated 2007 (1 pg). |
“Clearing House Electronic Check Clearing System (CHECCS) Operating Rules,” An IP.com Prior Art Database Technical Disclosure, Jul. 29, 2015 (35 pgs). |
“Compliance with Regulation CC”, http://www.federalreserve.gov/Pubs/regcc/regcc.htm, Jan. 24, 2006 (6 pgs). |
“Customer Personalized Bank Checks and Address Labels” Checks Your Way Inc., http://www.checksyourway.com/htm/web_pages/faq.htm, Cited in U.S. Pat. No. 7,900,822, as dated 2007 (6 pgs). |
“Deposit Now: Quick Start User Guide,” BankServ, 2007, 2 pages. |
“Direct Deposit Application for Payroll”, Purdue University, Business Office Form 0003, http://purdue.edu/payroll/pdf/directdepositapplication.pdf, Jul. 2007 (2 pgs). |
“Direct Deposit Authorization Form”, www.umass.edu/humres/library/DDForm.pdf, May 2003 (3 pgs). |
“Direct Deposit,” University of Washington, http://www.washington.edu/admin/payroll/directdeposit.html, Cited in U.S. Pat. No. 7,900,822, as dated 2007 (3 pgs). |
“Electronic Billing Problem: The E-check is in the mail” American Banker—vol. 168, No. 95, May 19, 2003 (4 pgs). |
“First Wireless Handheld Check and Credit Card Processing Solution Launched by Commerciant®, MobileScape® 5000 Eliminates Bounced Checks, Enables Payments Everywhere,” Business Wire, Mar. 13, 2016, 3 pages. |
“Frequently Asked Questions” Bank of America, http://www.bankofamerica.com/deposits/checksave/index.cfm?template-lc_faq_bymail, Cited in U.S. Pat. No. 7,900,822, as dated 2007 (2 pgs). |
“Full Service Direct Deposit”, www.nonprofitstaffing.com/images/upload/dirdepform.pdf. Cited in U.S. Pat. No. 7,900,822, as dated 2001, (2 pgs). |
“How to Digitally Deposit a Check Image”, Smart Money Daily, Copyright 2008 (5 pgs). |
“ImageNet Mobile Deposit Provides Convenient Check Deposit and Bill Pay to Mobile Consumers,” Miteksystems, 2008 (2 pgs). |
“It's the easiest way to Switch banks”, LNB, http://www.inbky.com/pdf/LNBswitch-kit10-07.pdf Cited in U.S. Pat. No. 7,996,316, as dated 2007 (7 pgs). |
“Lesson 38—More Bank Transactions”, Turtle Soft, http://www.turtlesoft.com/goldenseal-software-manual.lesson38.htm, Cited in U.S. Pat. No. 7,900,822, as dated 2007 (8 pgs). |
“Middleware”, David E. Bakken, Encyclopedia of Distributed Computing, Kluwer Academic Press, 2001 (6 pgs). |
“Mitek Systems Announces Mobile Deposit Application for Apple iPhone,” http://prnewswire.com/cgi-bin/stories/pl?ACCT=104&STORY=/www/story/10-01- . . . , Nov. 25, 2008 (2 pgs). |
“NOVA Enhances Electronic Check Service to Benefit Multi-Lane Retailers,” Business Wire, Nov. 28, 2006, 2 pages. |
“Personal Finance”, PNC, http://www.pnc.com/webapp/unsec/productsandservice.do?sitearea=/PNC/home/personal/account+services/quick+switch/quick+switch+faqs, Cited in U.S. Pat. No. 7,900,822, as dated 2007 (12 pgs). |
“Refractive index” Wikipedia, the free encyclopedia; http://en.wikipedia.org/wiki/refractiveindex.com Oct. 16, 2007 (4 pgs). |
“Remote check deposit is the answer to a company's banking problem,” Daily Breeze, Torrance, CA, Nov. 17, 2006, 2 pgs. |
“Remote Deposit Capture”, Plante & Moran, http://plantemoran.com/industries/fincial/institutions/bank/resources/community+bank+advisor/2007+summer+issue/remote+deposit+capture.htm, Cited in U.S. Pat. No. 7,900,822, as dated 2007 (3 pgs). |
“Remote Deposit” National City, http://www.nationalcity.com/smallbusiness/cashmanagement/remotedeposit/default.asp; Cited in U.S. Pat. No. 7,900,822, as dated 2007 (1 pg). |
“Save on ATM Fees”, RedEye Edition, Chicago Tribune, Chicago, IL Jun. 30, 2007 (2 pgs). |
“SNB Check Capture: SmartClient User's Guide,” Nov. 2006, 21 pgs. |
“Switching Made Easy,” Bank of North Georgia, http://www.banknorthgeorgia.com/cmsmaster/documents/286/documents616.pdf, 2007 (7 pgs). |
“Two Words Every Business Should Know: Remote Deposit,” Canon, http://www.rpsolutions.com/rpweb/pdfs/canon_rdc.pdf, 2005 (7 pgs). |
“Virtual Bank Checks”, Morebusiness.com, http://www.morebusiness.com/running_yourbusiness/businessbits/d908484987.brc, Cited in U.S. Pat. No. 7,900,822, as dated 2007 (3 pgs). |
“WallStreetGrapevine.com” Stocks on the Rise: JADG, BKYI, MITK; Mar. 3, 2008 (4 pgs). |
“What is check Fraud”, National Check Fraud Center, http://www.ckfraud.org/ckfraud.html , Cited in U.S. Pat. No. 7,900,822, as dated 2007 (12 pgs). |
“Exchangeable image file format for digital still cameras: Exif Version 2.2,” Standard of Electronics and Information Technology Industries Associate, JEITA CP-3451, Technical Standardization Committee on AV & IT Storage Systems and Equipments, Japan Electronics and Information Technology Industries Association, Apr. 2002 (154 pgs). (retrieved from: http://www.exif.org/Exif2-2.PDF). |
Affinity Federal Credit Union, “Affinity Announces Online Deposit,” Aug. 4, 2005 (1 pg). |
Albrecht, W. Steve, “Check Kiting: Detection, Prosecution and Prevention,” The FBI Law Enforcement Bulletin, Nov. 1, 1993 (6 pgs). |
Alves, Vander and Borba, Paulo; “Distributed Adapters Pattern: A Design for Object-Oriented Distributed Applications”; First Latin American Conference on Pattern Languages of Programming; Oct. 2001; pp. 132-142; Rio de Janeiro, Brazil (11 pgs). |
Amber Avalona-Butler / Paraglide, “At Your Service: Best iPhone Apps for Military Lifestyle,” Jul. 9, 2010 (2 pgs). |
Anderson, Milton M. “FSML and Echeck”, Financial Services Technology Consortium, 1999 (17 pgs). |
Aradhye, Hrishikesh B., “A Generic Method for Determining Up/Down Orientation of Text in Roman and Non-Roman Scripts,” Pattern Recognition Society, Dec. 13, 2014, 18 pages. |
Archive Index Systems; Panini My Vision X-30 or VX30 or X30 © 1994-2008 Archive Systems, Inc. P.O. Box 40135 Bellevue, WA USA 98015 (2 pgs). |
Association of German Banks, SEPA 2008: Uniform Payment Instruments for Europe, Berlin, Cited in U.S. Pat. No. 7,900,822, as dated Jul. 2007, Bundesverband deutscher Banken e.V. (42 pgs). |
Automated Merchant Systems, Inc., “Electronic Check Conversion,” http://www.automatedmerchant.com/electronic_check_conversion.cfm, 2006, downloaded Oct. 18, 2006 (3 pgs). |
Bank Systems & Technology, Untitled Article, May 1, 2006, http://www.banktech.com/showarticle.jhtml? articleID=187003126, “Are you Winning in the Payment World?” (4 pgs). |
BankServ, “DepositNow: What's the difference?” Cited in U.S. Pat. No. 7,970,677, as dated 2006, (4 pgs). |
BankServ, Product Overview, http://www.bankserv.com/products/remotedeposit.htm, Cited in U.S. Pat. No. 7,970,677, as dated 2006, (3 pgs). |
Bills, Steve, “Automated Amount Scanning Is Trend in Remote-Deposit,” American Banker, New York, NY, Aug. 30, 2005, (3 pgs). |
Blafore, Bonnie “Lower Commissions, Fewer Amenities”, Better Investing, Madison Heights: Feb. 2003, vol. 52, Iss 6, (4 pgs). |
BLM Technologies, “Case Study: Addressing Check 21 and RDC Error and Fraud Threats,” Remote Deposit Capture News Articles from Jun. 11, 2007, Retrieved from http://www.remotedepositcapture.com/News/june_11_2007.htm on Feb. 19, 2008 (5 pgs). |
Blue Mountain Consulting, from URL: www.bluemontainconsulting.com, Cited in U.S. Pat. No. 7,900,822, as dated Apr. 26, 2006 (3 pgs). |
Board of Governors of the federal reserve system, “Report to the Congress on the Check Clearing for the 21st Century Act of 2003” Apr. 2007, Submitted to Congress pursuant to section 16 of the Check Clearing for the 21st Century Act of 2003, (59 pgs). |
Braun, Tim, “Camdesk—Towards Portable and Easy Document Capture,” Image Understanding and Pattern Recognition Research Group, Department of Computer Science, University of Kaiserslautern, Technical Report, Mar. 29, 2005 (64 pgs). (Retrieved from: https://pdfs.semanticscholar.org/93b2/ea0d12f24c91f3c46fa1c0d58a76bb132bd2.pdf). |
Bruene, Jim; “CheckFree to Enable In-Home Remote Check Deposit for Consumers and Small Business”, NetBanker.Com, Financial Insite, Inc., http://www.netbanker.com/2008/02/checkfree_to_enableinhome_rem.html, Feb. 5, 2008 (3 pgs). |
Bruene, Jim; “Digital Federal Credit Union and Four Others Offer Consumer Remote Deposit Capture Through EasCorp”, NetBanker—Tracking Online Finance, www.netbanker.com/2008/04/digital_federal_credit_union_a.html, Apr. 13, 2008 (3 pgs). |
Bruno, M., “Instant Messaging, Bank Technology News,” Dec. 2002 (3 pgs). |
Burnett, J. “Depository Bank Endorsement Requirements,” BankersOnline.com, http://www.bankersonline.com/cgi-bin/printview/printview.pl, Jan. 6, 2003 (3 pgs). |
Canon, ImageFormula CR-25/CR-55, “Improve Your Bottom Line with Front-Line Efficiencies”, 0117W117, 1207-55/25-1 OM-BSP, Cited in U.S. Pat. No. 7,949,587 as dated 2007. (4 pgs). |
Carrubba, P. et al., “Remote Deposit Capture: A White Paper Addressing Regulatory, Operational and Risk Issues,” NetDeposit Inc., 2006 (11 pgs). |
Century Remote Deposit High-Speed Scanner User's Manual Release 2006, (Century Manual), Century Bank, 2006, (32 pgs). |
Chiang, Chuck, The Bulletin, “Remote banking offered”, http://bendbulletin.com/apps/pbcs.dll/article?AID=/20060201/BIZ0102/602010327&templ . . . , May 23, 2008 (2 pgs). |
CNN.com/technology, “Scan, deposit checks from home”, www.cnn.com/2008/TECH/biztech/02/07/check.scanning.ap/index.html, Feb. 7, 2008 (3 pgs). |
Constanzo, Chris, “Remote Check Deposit: Wells Captures a New Checking Twist”, Bank Technology News Article—May 2005, www.americanbanker.com/btn_article.html?id=20050502YQ50FSYG (2 pgs). |
Craig, Ben, “Resisting Electronic Payment Systems: Burning Down the House?”, Federal Reserve Bank of Cleveland, Jul. 1999 (4 pgs). |
Creativepaymentsolutions.com, “Creative Payment Solutions—Websolution,” www.creativepaymentsolution.com/cps/financialservices/websolution/default.html, Copyright 2008, Creative Payment Solutions, Inc. (1 pg). |
Credit Union Journal, “The Ramifications of Remote Deposit Capture Success”, www.cujournal.com/printthis.html?id=20080411EODZT57G, Apr. 14, 2008 (1 pg). |
Credit Union Journal, “AFCU Averaging 80 DepositHome Transactions Per Day”, Credit Union Journal, Aug. 15, 2005 (1 pg). |
Credit Union Management, “When You wish Upon an Imaging System . . . the Right Selection Process can be the Shining Star,” Credit Union Management, Aug. 1993, printed from the internet at <http://search.proquest.com/docview/227756409/14138420743684F7722/15?accountid=14 . . . >, on Oct. 19, 2013 (11 pgs). |
DCU Member's Monthly—Jan. 2008, “PC Deposit—Deposit Checks from Home!”, http://www.mycreditunionnewsletter.com/dcu/0108/page1.html, Copyright 2008 Digital Federal Credit Union (2 pgs). |
De Jesus, A. et al., “Distributed Check Processing in a Check 21 Environment: An educational overview of the opportunities and challenges associated with implementing distributed check imaging and processing solutions,” Panini, 2004, pp. 1-22. |
De Queiroz, Ricardo et al., “Mixed Raster Content (MRC) Model for Compound Image Compression”, 1998 (14 pgs). |
Debello, James et al., “RDM and Mitek Systems to Provide Mobile Check Deposit,” Mitek Systems, Inc., San Diego, California and Waterloo, Ontario, (Feb. 24, 2009), 2 pgs. |
DeYoung, Robert; “The Financial Performance of Pure Play Internet Banks”; Federal Reserve Bank of Chicago Economic Perspectives; 2001; pp. 60-75; vol. 25, No. 1 (16 pgs). |
Dias, Danilo et al., “A Model for the Electronic Representation of Bank Checks”, Brasilia Univ. Oct. 2006 (5 pgs). |
Digital Transactions News, “An ACH-Image Proposal for Check Roils Banks and Networks” May 26, 2006 (3 pgs). |
Dinan, R.F. et al., “Image Plus High Performance Transaction System”, IBM Systems Journal, 1990 vol. 29, No. 3 (14 pgs). |
Doermann, David et al., “Progress in Camera-Based Document Image Analysis,” Proceedings of the Seventh International Conference on Document Analysis and Recognition (ICDAR 2003) 0-7695-1960-1/03, 2003, IEEE Computer Society, 11 pages. |
Duvall, Mel, “Remote Deposit Capture,” Baseline, vol. 1, Issue 70, Mar. 2007, 2 pgs. |
eCU Technologies, “Upost Remote Deposit Solution,” Retrieved from the internet https://www.eutechnologies.com/products/upost.html, downloaded 2009 (1 pg). |
EFT Network Unveils FAXTellerPlus, EFT Network, Inc., www.eftnetwork.com, Jan. 13, 2009 (2 pgs). |
ElectronicPaymentProviders, Inc., “FAQs: ACH/ARC, CheckVerification/Conversion/Guarantee, RCK Check Re-Presentment,” http://www.useapp.com/faq.htm, downloaded Oct. 18, 2006 (3 pgs). |
Federal Check 21 Act, “New Check 21 Act effective Oct. 28, 2004: Bank No Longer Will Return Original Cancelled Checks,” Consumer Union's FAQ's and Congressional Testimony on Check 21, www.consumerlaw.org/initiatives/content/check21_content.html, Cited in U.S. Pat. No. 7,873,200, as dated Dec. 2005 (20 pgs). |
Federal Reserve Board, “Check Clearing for the 21st Century Act”, FRB, http://www.federalreserve.gov/paymentsystems/truncation/, Mar. 1, 2006 (1 pg). |
Federal Reserve System, “12 CFR, Part 229 [Regulation CC]: Availability of Funds and Collection of Checks,” Federal Registrar, Apr. 28, 1997, pp. 1-50. |
Federal Reserve System, “Part IV, 12 CFR Part 229 [Regulation CC]: Availability of Funds and Collection of Checks; Final Rule,” Federal Registrar, vol. 69, No. 149, Aug. 4, 2004, pp. 47290-47328. |
Fest, Glen., “Patently Unaware” Bank Technology News, Apr. 2006, Retrieved from the internet at URL:http://banktechnews.com/article.html?id=2006403T7612618 (5 pgs). |
Fidelity Information Services, “Strategic Vision Embraces Major Changes in Financial Services Solutions: Fidelity's long-term product strategy ushers in new era of application design and processing,” Insight, 2004, pp. 1-14. |
Fisher, Dan M., “Home Banking in the 21st Century: Remote Capture Has Gone Retail”, May 2008 (4 pgs). |
Furst, Karen et al., “Internet Banking: Developments and Prospects”, Economic and Policy Analysis Working Paper Sep. 2000, Sep. 2000 (60 pgs). |
Garry, M., “Checking Options: Retailers face an evolving landscape for electronic check processing that will require them to choose among several scenarios,” Supermarket News, vol. 53, No. 49, 2005 (3 pgs). |
German Shegalov, Diplom-Informatiker, “Integrated Data, Message, and Process Recovery for Failure Masking in Web Services”, Dissertation Jul. 2005 (146 pgs). |
Gupta, Amar et al., “An Integrated Architecture for Recognition of Totally Unconstrained Handwritten Numerals”, WP#3765, Jan. 1993, Productivity from Information Technology “Profit” Research Initiative Sloan School of Management (20 pgs). |
Gupta, Maya R. et al., “OCR binarization and image pre-processing for searching historical documents,” Pattern Recognition, vol. 40, No. 2, Feb. 2007, pp. 389-397. |
Hale, J., “Picture this: Check 21 uses digital technology to speed check processing and shorten lag time,” Columbus Business First, http://columbus.bizjournals.com/columbus/stories/2005/03/14focus1.html, downloaded 2007 (3 pgs). |
Hartly, Thomas, “Banks Check Out New Image”, Business First, Buffalo: Jul. 19, 2004, vol. 20, Issue 43, (3 pgs). |
Heckenberg, D. “Using Mac OS X for Real-Time Image Processing” Oct. 8, 2003 (15 pgs). |
Herley, Cormac, “Efficient Inscribing of Noisy Rectangular Objects in Scanned Images,” 2004 International Conference on Image Processing, 4 pages. |
Hildebrand, C. et al., “Electronic Money,” Oracle, http://www.oracle.com/oramag/profit/05-feb/p15financial.html, 2005, downloaded Oct. 18, 2006 (5 pgs). |
Hillebrand, G., “Questions and Answers About the Check Clearing for the 21st Century Act, 'Check 21,” ConsumersUnion.org, http://www.consumersunion.org/finance/ckclear1002.htm, Jul. 27, 2004, downloaded Oct. 18, 2006 (6 pgs). |
Iida, Jeanne, “The Back Office: Systems—Image Processing Rolls on as Banks Reap Benefits,” American Banker, Jul. 19, 1993, printed from the internet at <http://search.proquest.com/docview/292903245/14138420743684F7722/14?accountid= 14 . . . >, on Oct. 19, 2013 (3 pgs). |
Image Master, “Photo Restoration: We specialize in digital photo restoration and photograph repair of family pictures”, http://www.imphotorepair.com, Cited in U.S. Pat. No. 7,900,822, as downloaded Apr. 2007 (1 pg). |
Investment Systems Company, “Portfolio Accounting System,” 2000, pp. 1-32. |
JBC, “What is a MICR Line?,” eHow.com, retrieved from http://www.ehow.com/about_4684793_what-micr-line.html on May 4, 2009 (2 pgs). |
Johnson, Jennifer J., Secretary of the Board; Federal Reserve System, 12 CFR Part 229, Regulation CC; “Availability of Funds and Collection of Checks”. Cited in U.S. Pat. No. 7,900,822, as dated 2009, (89 pgs). |
Kendrick, Kevin B., “Check Kiting, Float for Purposes of Profit,” Bank Security & Fraud Prevention, vol. 1, No. 2, 1994 (3 pgs). |
Kiser, Elizabeth K.; “Modeling the Whole Firm: The Effect of Multiple Inputs and Financial Intermediation on Bank Deposit Rates;” FEDS Working Paper No. 2004-07; Jun. 3, 2003; pp. 1-46 (46 pgs). |
Knestout, Brian P. et al., “Banking Made Easy” Kiplinger's Personal Finance Washington, Jul. 2003, vol. 57, Iss 7 (5 pgs). |
Kornai Andras et al., “Recognition of Cursive Writing on Personal Checks”, Proceedings of International Workshop on the Frontiers in Handwriting Recognition, Cited in U.S. Pat. No. 7,900,822, as dated Sep. 1996, (6 pgs). |
Lampert, Christoph et al., “Oblivious Document Capture and Real-Time Retrieval,” International Workshop on Camera Based Document Analysis and Recognition (CBDAR), 2005 (8 pgs). Retrieved from the Internet at: http://www-cs.ccny.cuny.edu/˜wolberg/capstone/bookwarp/LampertCBDAR05.pdf). 8 pgs. |
Levitin, Adam J., Remote Deposit Capture: A Legal and Transactional Overview, Banking Law Journal, p. 115, 2009 (RDC). |
Liang, Jian et al., Camera-Based Analysis of Text and Documents: A Survey, International Journal on Document Analysis and Recognition, Jun. 21, 2005, 21 pages. |
Luo, Xi-Peng et al., “Design and Implementation of a Card Reader Based on Build-In Camera,” Proceedings of the 17th International Conference on Pattern Recognition, 2004, 4 pages. |
Masonson, L., “Check Truncation and ACH Trends—Automated Clearing Houses”, healthcare financial management associate, http://www.findarticles.com/p/articles/mLm3276/is_n7_v47/ai_14466034/print, 1993 (2 pgs). |
Matthews, Deborah, “Advanced Technology Makes Remote Deposit Capture Less Risky,” Indiana Bankers Association, Apr. 2008 (2 pgs). |
Metro 1 Credit Union, “Remote Banking Services,” http://www.metro1cu.org/metro1cu/remote.html, downloaded Apr. 17, 2007 (4 pgs). |
Mitek systems, “Imagenet Mobile Deposit”, San Diego, CA, downloaded 2009 (2 pgs). |
Mitek Systems: Mitek Systems Launches First Mobile Check Deposit and Bill Pay Application, San Diego, CA, Jan. 22, 2008 (3 pgs). |
Mohl, Bruce, “Banks Reimbursing ATM Fee to Compete With Larger Rivals”, Boston Globe, Boston, MA, Sep. 19, 2004 (3 pgs). |
Moreau, T., “Payment by Authenticated Facsimile Transmission: a Check Replacement Technology for Small and Medium Enterprises,” CONNOTECH Experts-conseils, Inc., Apr. 1995 (31 pgs). |
Nelson, B. et al., “Remote deposit capture changes the retail landscape,” Northwestern Financial Review, http://findarticles.com/p/articles/mi_qa3799/is_200607/ai_n16537250, 2006 (3 pgs). |
NetBank, Inc., “Branch Out: Annual Report 2004,” 2004 (150 pgs). |
NetBank, Inc., “Quick Post: Deposit and Payment Forwarding Service,” 2005 (1 pg). |
NetDeposit Awarded Two Patents for Electronic Check Process, NetDeposit, Jun. 18, 2007, (1 pg). |
Nixon, Julie et al., “Fiserv Research Finds Banks are Interested in Offering Mobile Deposit Capture as an,” Fiserv, Inc. Brookfield, Wis., (Business Wire), (Feb. 20, 2009), 2 pgs. |
Online Deposit: Frequently Asked Questions, http://www.depositnow.com/faq.html, Copyright 2008 (1 pg). |
Onlinecheck.com/Merchant Advisors, “Real-Time Check Debit”, Merchant Advisors: Retail Check Processing Check Conversion, http://www.onlinecheck.com/wach/rcareal.htm, Cited in U.S. Pat. No. 7,900,822, as dated 2006 (3 pgs). |
Oxley, Michael G., from committee on Financial Services; “Check Clearing for the 21st Century Act”, 108th Congress, 1st Session House of Representatives report 108-132, Jun. 2003 (20 pgs). |
Oxley, Michael G., From the committee of conference; “Check Clearing for the 21st Century Act” 108th Congress, 1st Session Senate report 108-291, Oct. 1, 2003 (27 pgs). |
Palacios, Rafael et al., “Automatic Processing of Brazilian Bank Checks”. Cited in U.S. Pat. No. 7,900,822, as dated 2002 (28 pgs). |
Patterson, Scott “USAA Deposit@Home—Another WOW moment for Net Banking”, NextCU.com, Jan. 26, 2007 (5 pgs). |
Public Law 108-100, 108 Congress; “An Act Check Clearing for the 21st Century Act”, Oct. 28, 2003, 117 STAT. 1177 (18 pgs). |
Rao, Bharat; “The Internet and the Revolution in Distribution: A Cross-Industry Examination”; Technology in Society; 1999; pp. 287-306; vol. 21, No. 3 (20 pgs). |
Remotedepositcapture, URL:www.remotedepositcapture.com, Cited in U.S. Pat. No. 7,900,822, as dated 2006 (5 pgs). |
RemoteDepositCapture.com, “PNC Bank to Offer Ease of Online Deposit Service Integrated with QuickBooks to Small Businesses”, Remote Deposit Capture News Articles from Jul. 24, 2006, (2 pgs). |
RemoteDepositCapture.com, Remote Deposit Capture News Articles from Jul. 6, 2006, “BankServ Announces New Remote Deposit Product Integrated with QuickBooks” (3 pgs). |
Remotedepositcapture.com, LLC, “Remote Deposit Capture Overview,” RDC Overview, http://remotedepositcapture.com/overview/RDC_overview.htm, Cited in U.S. Pat. No. 7,900,822, as dated Mar. 12, 2007 (4 pgs). |
Richey, J. C. et al., “EE 4530 Check Imaging,” Nov. 18, 2008 (10 pgs). |
Ritzer, J.R. “Hinky Dinky helped spearhead POS, remote banking movement”, Bank Systems and Equipment, vol. 21, No. 12, Dec. 1984 (1 pg). |
Rivlin, Alice M. et al., Chair, Vice Chair—Board of Governors, Committee on the Federal Reserve in the Payments Mechanism—Federal Reserve System, “The Federal Reserve in the Payments Mechanism”, Jan. 1998 (41 pgs). |
Rose, Sarah et al., “Best of the Web: The Top 50 Financial Websites”, Money, New York, Dec. 1999, vol. 28, Iss. 12 (8 pgs). |
Shelby, Hon. Richard C. (Committee on Banking, Housing and Urban Affairs); “Check Truncation Act of 2003”, calendar No. 168, 108th Congress, 1st Session Senate report 108-79, Jun. 2003 (27 pgs). |
SoyBank Anywhere, “Consumer Internet Banking Service Agreement,” Dec. 6, 2004 (6 pgs). |
Teixeira, D., “Comment: Time to Overhaul Deposit Processing Systems,” American Banker, Dec. 10, 1998, vol. 163, No. 235, p. 15 (3 pgs). |
Thailandguru.com: How and where to Pay Bills @ www.thailandguru.com/paying-bills.html, © 1999-2007 (2 pgs). |
The Automated Clearinghouse, “Retail Payment Systems; Payment Instruments Clearing and Settlement: The Automated Clearinghouse (ACH)”, www.ffiec.gov/ffiecinfobase/booklets/retail/retail_02d.html, Cited in U.S. Pat. No. 7,900,822, as dated Dec. 2005 (3 pgs). |
The Green Sheet 2.0: Newswire, “CO-OP adds home deposit capabilities to suite of check imaging products”, www.greensheet.com/newswire.php?newswire_id=8799, Mar. 5, 2008 (2 pgs). |
Tygar, J.D., Atomicity in Electronic Commerce, In ACM Networker, 2:2, Apr./May 1998 (12 pgs). |
Valentine, Lisa, “Remote Deposit Capture Hot Just Got Hotter,” ABA Banking Journal, Mar. 2006, p. 1-9. |
Vaream, Craig, “Image Deposit Solutions: Emerging Solutions for More Efficient Check Processing,” JP Morgan Chase, Nov. 2005 (16 pgs). |
Wade, Will, “Early Debate on Remote-Capture Risk,” American Banker, New York, NY, May 26, 2004 (3 pgs). |
Wade, Will, “Early Notes: Updating Consumers on Check 21” American Banker Aug. 10, 2004 (3 pgs). |
Wallison, Peter J., “Wal-Mart Case Exposes Flaws in Banking-Commerce Split”, American Banker, vol. 167. No. 8, Jan. 11, 2002 (3 pgs). |
Wells Fargo 2005 News Releases, “The New Wells Fargo Electronic Deposit Services Break Through Banking Boundaries in the Age of Check 21”, San Francisco Mar. 28, 2005, www.wellsfargo.com/press/3282005_check21Year=2005 (1 pg). |
Wells Fargo Commercial, “Remote Deposit”, www.wellsfargo.com/com/treasury_mgmt/receivables/electronic/remote_deposit, Copyright 2008 (1 pg). |
White, J.M. et al., “Image Thresholding for Optical Character Recognition and Other Applications Requiring Character Image Extraction”, IBM J. Res. Development, Jul. 1983, vol. 27, No. 4 (12 pgs). |
Whitney et al., “Reserve Banks to Adopt DSTU X9.37/2003 Format for Check 21 Image Services”, American Bankers Association, May 18, 2004, http://www.aba.com/NR/rdonlyres/CBDC1A5C-43E3-43CC-B733-BE417C638618/35930/DSTUFormat.pdf (2 pages). |
Wikipedia®, “Remote Deposit,” http://en.wikipedia.org/wiki/Remote_deposit, 2007 (3 pgs). |
Windowsfordevices.com, “Software lets camera phone users deposit checks, pay bills”, www.windowsfordevices.com/news/NS3934956670.html, Jan. 29, 2008 (3 pgs). |
Wolfe, Daniel, “Check Image Group Outlines Agenda,” American Banker, New York, N.Y.: Feb. 13, 2009, vol. 174, Iss. 30, p. 12. (2 pgs). |
Woody Baird Associated Press, “Pastor's Wife got Scammed—She Apparently Fell for Overseas Money Scheme,” The Commercial Appeal, Jul. 1, 2006, p. A. 1. |
Zandifar, A., “A Video-Based Framework for the Analysis of Presentations/Posters,” International Journal on Document Analysis and Recognition, Feb. 2, 2005, 10 pages. |
Zhang, C.Y., “Robust Estimation and Image Combining” Astronomical Data Analysis Software and Systems IV, ASP Conference Series, 1995 (5 pgs). |
Zions Bancorporation, “Moneytech, the technology of money in our world: Remote Deposit,” http://www.bankjunior.com/pground/moneytech/remote_deposit.jsp, 2007 (2 pgs). |
“Quicken Bill Pay”, Retrieved from the Internet on Nov. 27, 2007 at: <URL:http://quicken.intuit.com/quicken-bill-pay.jhtml>, 2 pgs. |
Application as filed on Jun. 8, 2018 for U.S. Appl. No. 16/018,868, 39 pgs. |
“Start to Simplify with Check Imaging a Smarter Way to Bank”, Retrieved from the Internet on Nov. 27, 2007, at: <URL: http://www.midnatbank.com/Internet%20Banking/internet_Banking.html>, 3 pgs. |
Defendant Wells Fargo Bank, N.A.'s Answer, Affirmative Defenses, and Counterclaims to Plaintiff's Complaint, dated Aug. 14, 2018, 64 pgs. |
Leica Digilux 2 Instructions located on the Internet: http://www.overgaard.dk/pdf/d2_manual.pdf (attached as Exhibit 2 from the Defendant Wells Fargo Bank N.A.'s Answer dated Aug. 14, 2018), 95 pgs. |
Sony Digital Camera User's Guide/ Trouble Shooting Operating Instructions, copyright 2005, located on the Internet at: https://www.sony.co.uk/electronics/support/res/manuals/2654/26544941M.pdf (attached as Exhibit 3 from the Defendant Wells Fargo Bank N.A.'s Answer dated Aug. 14, 2018), 136 pgs. |
Panasonic Operating Instructions for Digital Camera/Lens Kit Model No. DMC-L1K, https://www.panasonic.com/content/dam/Panasonic/support_manual/Digital_Still_Camera/English_01-vqt0-vqt2/vqt0w95_L1_oi.pdf (attached as Exhibit 4 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 129 pgs. |
Nikon Digital Camera D300 User's Manual, located on the Internet at: http://download.nikonimglib.com/archive2/iBuJv00Aj97i01y8BrK49XX0Ts69/D300_EU(En)04.pdf (attached as Exhibit 5 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 195 pgs. |
Canon EOS 40D Digital Camera Instruction Manual, located on the Internet at: http://gdlp01.c-wss.com/gds/6/0900008236/01/EOS40D_HG_EN.pdf (attached as Exhibit 6 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 38 pgs. |
Motorola RAZR MAXX V6 User Manual, located on the Internet at: https://www.phonearena.com/phones/Motorola-RAZR-MAXX-V6_id1680, (attached as Exhibit 7 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 36 pgs. |
Motomanual for MOTORAZR, located on the Internet at: https://www.cellphones.ca/downloads/phones/manuals/motorola-razr-v3xx-manual.pdf (excerpts attached as Exhibit 8 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 34 pgs. |
Nokia N95 8GB User Guide, copyright 2009, located on the Internet at: https://www.nokia.com/en_int/phones/sites/default/files/user-guides/Nokia_N95_8GB_Extended_UG_en.pdf (excerpts attached as Exhibit 9 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 77 pgs. |
Helio Ocean User Manual, located on the Internet at: https://standupwireless.com/wp-content/uploads/2017/04/Manual_PAN-TECH_OCEAN.pdf (excerpts attached as Exhibit 10 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 76 pgs. |
HTC Touch Diamond Manual, copyright 2008, (attached as Exhibit 11 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 257 pgs. |
Automated Clearing Houses (ACHs), Federal Reserve Bank of New York (May 2000) available at: https://www.newyorkfed.org/aboutthefed/fedpoint/fed31.html, (attached as Exhibit 12 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 4 pgs. |
POP, ARC and BOC—A Comparison, Federal Reserve Banks, at 1 (Jan. 7, 2009), available on the Internet at: https://web.archive.org/web/20090107101808/https://www.frbservices.org/files/eventseducation/pdf/pop_arc_boc_comparison.pdf (attached as Exhibit 13 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 3 pgs. |
David B. Humphrey & Robert Hunt, Getting Rid of Paper: Savings From Check 21, Working Paper No. 12-12, Research Department, Federal Reserve Bank of Philadelphia, (May 2012), available on the Internet at: https://philadelphiafed.org/-/media/research-and-data/publications/working-papers/2012/wp12-12.pdf, (attached as Exhibit 14 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 29 pgs. |
Jeffrey M. Lacker, Payment System Disruptions and the Federal Reserve Following Sep. 11, 2001, The Federal Reserve Bank of Richmond, (Dec. 23, 2003) (attached as Exhibit 19 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 55 pgs. |
Check Clearing for the 21st Century Act Foundation for Check 21 Compliance Training, Federal Financial Institutions Examination Council, (Oct. 16, 2004), available on the Internet at: https://web.archive.org/web/20041016100648/https://www.ffiec.gov/exam/check21/check21foundationdoc.htm, (excerpts attached as Exhibit 20 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 11 pgs. |
Big Red Book, Adobe Systems Incorporated, copyright 2000, (attached as Exhibit 27 from the Defendant Wells Fargo Bank, N.A.'s Answer dated Aug. 14, 2018), 45 pgs. |
Petition filed by Wells Fargo Bank, N.A. for Covered Business Method Review of Claims 1-20 of U.S. Pat. No. 9,818,090, dated Nov. 8, 2018, 90 pgs. |
Petition filed by Wells Fargo Bank, N.A. for Covered Business Method Review of Claims 1-20 of U.S. Pat. No. 9,336,517, dated Nov. 8, 2018, 98 pgs. |
Petition filed by Wells Fargo Bank, N.A. for Covered Business Method Review of Claims 1-20 of U.S. Pat. No. 8,977,571, dated Nov. 8, 2018, 95 pgs. |
Petition filed by Wells Fargo Bank, N.A. for Covered Business Method Review of Claims 1-23 of U.S. Pat. No. 8,699,779, dated Nov. 8, 2018, 101 pgs. |
Declaration of Peter Alexander, Ph.D., CBM2019-00004, Nov. 8, 2018, 180 pgs. |
“Machine Accepts Bank Deposits”, New York Times, Apr. 12, 1961, 1 pg. |
Shah, Moore's Law, Continuous Everywhere But Differentiable Nowhere, Feb. 12, 2009, located on the Internet at: http://samjshah.com/2009/02/24/moores-law/, 5 pgs. |
Rockwell, The Megapixel Myth, KenRockwell.com, 2008, located on the Internet at: http://kenrockwell.com/tech/mpmyth.htm, 6 pgs. |
Gates, A History of Wireless Standards, Wi-Fi Back to Basics, Aerohive Blog, Jul. 2015, located on the Internet at: http://blog.aerohive.com/a-history-of-wireless-standards, 5 pgs. |
Apple Reinvents the Phone with iPhone, Jan. 2007, located on the Internet at: https://www.apple.com/newsroom/2007/01/09Apple-Reinvents-the-Phone-with-iPhone/, 4 pgs. |
Chen, Brian et al., iPhone 3GS Trounces Predecessors, Rivals in Web Browser Speed Test, Wired, Jun. 24, 2009, located on the Internet at: www.wired.com/2009.3gs-speed/, 10 pgs. |
Berman, How Hitchcock Turned a Small Budget Into a Great Triumph, Time.com, Apr. 29, 2015, located on the Internet at: http://time.com/3823112/alfred-hitchcock-shadow-of-a-doubt, 1 pg. |
Askey, Leica Digilux 2 Review (pts. 1,3,7), Digital Photography Review, May 20, 2004, located on the Internet at: https://www.dpreview.com/reviews/leicadigilux2, 20 pgs. |
Askey, Sony Cyber-shot DSC-R1 Review (pts. 1,3,7), Digital Photography Review, Dec. 6, 2005, located on the Internet at: http://www.dpreview.com/reviews/sonydscr1, 24 pgs. |
Askey, Panasonic Lumix DMC-L1 Review (pts. 1,3,7), Digital Photography Review, Apr. 11, 2007, located on the Internet at: https://www.dpreview.com/reviews/panasonicdmc11, 24 pgs. |
Askey, Nikon D300 In-depth Review (pts. 1,3,9), Digital Photography Review, Mar. 12, 2008, located on the Internet at: https://www.dpreview.com/reviews/nikond300, 24 pgs. |
Askey, Canon EOS 40D Review (pts. 1,4,10), Digital Photography Review, located on the Internet at: http://www.dpreview.com/reviews/canoneos40d, 24 pgs. |
Joinson et al., Olympus E-30 Review (pts. 1,4,8), Digital Photography Review, Mar. 24, 2009, located on the Internet at: www.dpreview.com/reviews/olympus30, 26 pgs. |
Quinn and Roberds, The Evolution of the Check as a Means of Payment: A Historical Survey, Federal Reserve Bank of Atlanta, Economic Review, 2008, 30 pgs. |
Wausau Financial Systems, Understanding Image Quality & Usability Within a New Environment, 2006, 22 pgs. |
iPhone App Store Downloads Top 10 Million in First Weekend, Jul. 14, 2008, located on the Internet at: http://www.apple.com/newsroom/2008/07/14iPhone-App-Store-Downloads-Top-10-Million-in-First-Weekend, 3 pgs. |
Knerr et al., The A2iA Intercheque System: Courtesy Amount and Legal Amount Recognition for French Checks in Automated Bankcheck Processing 43-86, Impedovo et al. eds., 1997, 50 pgs. |
149 Cong. Rec. H9289, Oct. 8, 2003, 6 pgs. |
Check Clearing for the 21st Century Act, H. R. Rep. No. 108-132, Jun. 2, 2003, 20 pgs. |
ITU-R M.1225, Guidelines for Evaluation of Radio Transmission Technologies for IMT-2000, dated 1997, located on the Internet at: https://www.itu.int/dms_pubrec/itu-r/rec/m/R-REC-M.1225-0-199702-I!!PDF-E.pdf, 60 pgs. |
E. MacKenzie, Photography Made Easy, copyright 1845, 80 pgs. |
12 CFR § 229.51 and Appendix D to Part 229 (Jan. 1, 2005 edition), 3 pgs. |
Excerpts from American National Standard for Financial Services, ANS, X9.100-140-2004-Specifications for an Image Replacement Document—IRD, Oct. 1, 2004, 16 pgs. |
Sumits, Major Mobile Milestones—The Last 15 Years, and the Next Five, Cisco Blogs, Feb. 3, 2016, located on the Internet at: https://blogs.cisco.com/sp/mobile-vni-major-mobile-milestones-the-last-15-years-and-the-next-five, 12 pgs. |
Apple Announces the New iPhone 3GS—The Fastest, Most Powerful iPhone Yet, Jun. 8, 2009, located on the Internet at: http://www.apple.com/newsroom/2009/06/08Apple-Announces-the-New-iPhone-3GS-The-Fastest-Most-Powerful-iPhone-Yet, 4 pgs. |
Motorola, Motomanual, MOTOROKR-E6-GSM-English for wireless phone, copyright 2006, 144 pgs. |
Patent Disclaimer for U.S. Pat. No. 8,699,779, filed on Mar. 4, 2019, 2 pgs. |
Patent Disclaimer for U.S. Pat. No. 8,977,571, filed on Feb. 20, 2019, 2 pgs. |
Patent Disclaimer for U.S. Pat. No. 9,336,517, filed on Mar. 4, 2019, 2 pgs. |
Patent Disclaimer for U.S. Pat. No. 9,818,090, filed on Feb. 20, 2019, 2 pgs. |
CBM2019-00002 U.S. Pat. No. 9,818,090, United Services Automobile Association (USAA)'s Patent Owner Preliminary Response, dated Feb. 20, 2019, 75 pgs. |
CBM2019-00002 U.S. Pat. No. 9,818,090, Declaration of Tim Crews in Support of Patent Owner Preliminary Response, dated Feb. 20, 2019, 8 pgs. |
CBM2019-00002 U.S. Pat. No. 9,818,090, Declaration of Matthew Calman in Support of Patent Owner Preliminary Response, dated Feb. 20, 2019, 14 pgs. |
CBM2019-00002 U.S. Pat. No. 9,818,090, Katie Knight Videotape Deposition Transcript, dated Feb. 8, 2019, 27 pgs. |
CBM2019-00002 U.S. Pat. No. 9,818,090, Peter Alexander, Ph.D., Oral and Videotaped Deposition, dated Jan. 23, 2019, 27 pgs. |
CBM2019-00002 U.S. Pat. No. 9,818,090, United Services Automobile Association (USAA)'s Updated Exhibit List, dated Mar. 19, 2019, 8 pgs. |
CBM2019-00003 U.S. Pat. No. 9,336,517, United Services Automobile Association (USAA)'s Patent Owner Preliminary Response, dated Mar. 4, 2019, 91 pgs. |
CBM2019-00003 U.S. Pat. No. 8,699,779, Declaration of Matthew Calman in Support of Patent Owner Preliminary Response, dated Mar. 4, 2019, 15 pgs. |
CBM2019-00003 U.S. Pat. No. 9,336,517, Katie Knight Videotape Deposition Transcript, dated Feb. 8, 2019, 27 pgs. |
CBM2019-00003 U.S. Pat. No. 9,336,517, Peter Alexander, Ph.D., Oral and Videotaped Deposition, dated Jan. 23, 2019, 27 pgs. |
CBM2019-00003 U.S. Pat. No. 9,336,517, United Services Automobile Association (USAA)'s Updated Exhibit List Pursuant to 37 CFR 42.63(e), dated Mar. 19, 2019, 8 pgs. |
CBM2019-00003 U.S. Pat. No. 9,336,517, Petitioner's Reply Brief to Patent Owner Preliminary Response Pursuant to Authorization Provided in Paper No. 14, dated Apr. 10, 2019, 10 pgs. |
CBM2019-00004 U.S. Pat. No. 8,977,571, Declaration of Tim Crews in Support of Patent Owner Preliminary Response, dated Feb. 20, 2019, 8 pgs. |
CBM2019-00004 U.S. Pat. No. 8,977,571, United Services Automobile Association (USAA)'s Patent Owner Preliminary Response, dated Feb. 20, 2019, 99 pgs. |
CBM2019-00004 U.S. Pat. No. 8,977,571, Declaration of Matthew Calman in Support of Patent Owner Preliminary Response, dated Feb. 20, 2019, 14 pgs. |
CBM2019-00004 U.S. Pat. No. 8,977,571, United Services Automobile Association (USAA)'s Updated Exhibit List Pursuant to 37 CFR 42.63(e), dated Mar. 19, 2019, 8 pgs. |
CBM2019-00005 U.S. Pat. No. 8,699,779, United Services Automobile Association (USAA)'s Patent Owner Preliminary Response, dated Mar. 4, 2019, 103 pgs. |
CBM2019-00005 U.S. Pat. No. 8,699,779, Katie Knight Videotape Deposition Transcript, dated Feb. 8, 2019, 27 pgs. |
CBM2019-00005 U.S. Pat. No. 8,699,779 Matthew A. Calman Declaration, dated Mar. 4, 2019, 15 pgs. |
CBM2019-00005 U.S. Pat. No. 8,699,779 Peter Alexander, Ph.D., Oral and Videotaped Deposition, dated Jan. 23, 2019, 27 pgs. |
CBM2019-00027 U.S. Pat. No. 9,224,136 Declaration of Peter Alexander, Ph.D., dated Mar. 28, 2019, 147 pgs. |
CBM2019-00027 U.S. Pat. No. 9,224,136 Petition for Covered Business Method Review of Claims 1-3, 5-9, 11-16 and 18 of U.S. Pat. No. 9,224,136, dated Mar. 28, 2019, 93 pgs. |
CBM2019-00027 U.S. Pat. No. 9,224,136 Notice of Filing Date Accorded to Petition and Time for Filing Patent Owner Preliminary Response, dated Apr. 8, 2019, 3 pgs. |
CBM2019-00028 U.S. Pat. No. 10,013,681, Plaintiff United Services Automobile Association (USAA) Preliminary Claim Constructions and Extrinsic Evidence, dated Mar. 15, 2019, 74 pgs. |
CBM2019-00028 U.S. Pat. No. 10,013,681, Declaration of Peter Alexander, Ph.D., dated Mar. 28, 2019, 94 pgs. |
CBM2019-00028 U.S. Pat. No. 10,013,681, Petition for Covered Business Method Review of Claims 1-30 of U.S. Pat. No. 10,013,681, dated Mar. 28, 2019, 99 pgs. |
CBM2019-00028 U.S. Pat. No. 10,013,681, Petitioner's Updated Exhibit List (as of Apr. 1, 2019) for U.S. Pat. No. 10,013,681, dated Apr. 1, 2019, 5 pgs. |
CBM2019-00028 U.S. Pat. No. 10,013,681, Notice of Filing Date Accorded to Petition and Time for Filing Patent Owner Preliminary Response for U.S. Pat. No. 10,013,681, dated Apr. 8, 2019, 3 pgs. |
CBM2019-00029 U.S. Pat. No. 10,013,605, Declaration of Peter Alexander, Ph.D., dated Mar. 28, 2019, 76 pgs. |
CBM2019-00029 U.S. Pat. No. 10,013,605, Petition for Covered Business Method Review of Claims 1-3, 5-14, 16-29 of U.S. Pat. No. 10,013,605, dated Mar. 28, 2019, 88 pgs. |
CBM2019-00029 U.S. Pat. No. 10,013,605, Plaintiff United Services Automobile Association (USAA) Preliminary Claim Constructions and Extrinsic Evidence, dated Mar. 15, 2019, 74 pgs. |
IPR2019-00815 U.S. Pat. No. 9,818,090, Petition for Inter Partes Review of Claims 1-9 of U.S. Pat. No. 9,818,090, dated Mar. 20, 2019, 56 pgs. |
IPR2019-00815 U.S. Pat. No. 9,818,090, Declaration of Peter Alexander, PhD. as filed in the IPR on Mar. 20, 2019, 99 pgs. |
IPR2019-00815 U.S. Pat. No. 9,818,090, Notice of Filing Date Accorded to Petition and Time for Filing Patent Owner Preliminary Response, dated Mar. 27, 2019, 5 pgs. |
IPR2019-00815 U.S. Pat. No. 9,818,090, Exhibit B Proposed Claim Constructions for the '571, '090, '779 and '517 Patents, filed on Feb. 28, 2019, 10 pgs. |
ABA Routing System Transit Number, Wikipedia, dated Sep. 27, 2006, 3 pgs. |
Accredited Standards Committee Technical Report TR 33-2006, dated Aug. 28, 2006, 75 pgs. |
ANS X9.100-140-2004, “Specification for an Image Replacement document—IRD”, American Standard for Financial Services, Oct. 1, 2004, 15 pgs. |
ANSI News, Check 21 Goes Into Effect Oct. 28, 2004, dated Oct. 25, 2004, 1 pg. |
ANSI, “Return Reasons for Check Image Exchange of IRDS”, dated May 6, 2016, 23 pgs. |
ANSI, “Specifications for Electronic Exchange of Check and Image Data”, dated Jul. 11, 2006, 230 pgs. |
ANSI X9.7-1999 (R2007), Bank Check Background and Convenience Amount Field Specification, dated Jul. 11, 2007, 86 pgs. |
ASC X9, “Specification for Electronic Exchange of Check and Image Data”, dated Mar. 31, 2003, 156 pgs. |
Bankers' Hotline, “Training Page: Learning the Bank Numbering System”, Copyright 2004, 2 pgs. |
BrainJar Validation Algorithms, archived on Mar. 16, 2016 from BrainJar.com, 2 pgs. |
Canon White Paper, “Two Words Every Business Should Know—Remote Deposit”, dated 2005, 7 pgs. |
CBR online, “Diebold launches ATM depository technology”, Oct. 4, 2007, 5 pgs. |
Cheq information Technology White Paper, “Teller Scanner Performance and Scanner Design: Camera Position Relative to the Feeder”, dated 2005, 7 pgs. |
De Jesus, Angie et al., “Distributed Check Processing in a Check 21 Environment”, dated Nov. 2004, 22 pgs. |
Federal Reserve Adoption of DSTU X9.37-2003, Image Cash Letter Customer Documentation Version 1.8, dated Oct. 1, 2008, 48 pgs. |
Fielding, R. et al., “RFC-2616—Hypertext Transfer Protocol”, Network Working Group, The Internet Society copyright 1999, 177 pgs. |
Hill, Simon, “From J-Phone to Lumia 1020: A Complete History of the Camera Phone”, dated Aug. 11, 2013, 19 pgs. |
Instrument—Definition from the Merriam-Webster Online Dictionary, dated Mar. 2, 2019, 1 pg. |
Instrument—Definition of instrument from the Oxford Dictionaries (British & World English), dated Jul. 2, 2017, 44 pgs. |
iPhone Application Programming Guide Device Support, dated Apr. 26, 2009, 7 pgs. |
Apple Announces the New iPhone 3GS—The Fastest, Most Powerful iPhone Yet, Press Release dated Jun. 8, 2009, 4 pgs. |
Klein, Robert, Financial Services Technology, “Image Quality and Usability Assurance: Phase 1 Project”, dated Jul. 23, 2004, 67 pgs. |
Lange, Bill, “Combining Remote Capture and IRD Printing, A Check 21 Strategy for Community and Regional Banks”, dated 2005, 25 pgs. |
Lee, Jeanne, “Mobile Check Deposits: Pro Tips to Ensure They Go Smoothly”, dated Feb. 19, 2016, 6 pgs. |
Meara, Bob, “State of Remote Deposit Capture 2015: Mobile is the New Scanner”, Dated May 26, 2015, obtained from the Internet at: https://www.celent.com/insights/57842967, 3 pgs. |
Meara, Bob, “State of Remote Deposit Capture 2015 Mobile Is the New Scanner”, dated May 2015, 56 pgs. |
Meara, Bob, “USAA's Mobile Remote Deposit Capture”, dated Jun. 26, 2009, 2 pgs. |
Mitek's Mobile Deposit Processes More Than Two Billion Checks, $1.5 Trillion in Cumulative Check Value, dated Mar. 18, 2018, 2 pgs. |
Mitek, “Video Release—Mitek MiSnap™ Mobile Auto Capture Improves Mobile Deposit® User Experience at Ten Financial Institutions”, dated Jul. 15, 2014, 2 pgs. |
NCR, Mobile Remote Deposit Capture (RDC), copyright 2011, 8 pgs. |
Nokia N90 Review Digital Trends, dated Feb. 11, 2019, obtained from the Internet at: https://www.digitaltrends.com/cell-phone-reviews/nokia-n90-review/, 11 pgs. |
Nokia N95 8GB User Guide, copyright 2009, (from the Wells Fargo Bank, N.A. IPR2019-00815, filed on Mar. 20, 2019), Part 1 of 3, 67 pgs. |
Nokia N95 8GB User Guide, copyright 2009, (from the Wells Fargo Bank, N.A. IPR2019-00815, filed on Mar. 20, 2019), Part 2 of 3, 60 pgs. |
Nokia N95 8GB User Guide, copyright 2009, (from the Wells Fargo Bank, N.A. IPR2019-00815, filed on Mar. 20, 2019), Part 3 of 3, 53 pgs. |
Patel, Kunur, Ad Age, “How Mobile Technology is Changing Banking's Future”, dated Sep. 21, 2009, 3 pgs. |
Remote Deposit Capture Basic Requirements, dated Aug. 22, 2009, 1 pg. |
Remote Deposit Capture.com Scanner Matrix, dated Oct. 21, 2011, 3 pgs. |
Rowles, Tony, USAA v. Wells Fargo No. 2:16-cv-245-JRG, e-mail correspondence dated Jan. 24, 2019, 2 pgs. |
Sechrest, Stuart et al., “Windows XP Performance”, Microsoft, dated Jun. 1, 2001, 20 pgs. |
Spencer, Harvey, “White Paper Check 21 Controlling Image Quality at the Point of Capture”, dated 2004, 7 pgs. |
Timothy R. Crews list of Patents, printed from the United States Patent and Trademark Office on Feb. 13, 2019, 7 pgs. |
Van Dyke, Jim, “2017 Mitek Mobile Deposit Benchmark Report”, copyright 2017, 50 pgs. |
Wausau, “Understanding Image Quality & Usability Within a New Environment”, copyright 2019, 1 pg. |
Whitney, Steve et al., “A Framework for Exchanging Image Returns”, dated Jul. 2001, 129 pgs. |
Number | Date | Country
---|---|---|
62167754 | May 2015 | US |