The present invention relates to sales billing. More specifically, the present invention pertains to invoices, sales receipts, and related transaction documents for business sales; such sales may be transacted person-to-person for deliveries at retailer locations, factory locations, shop locations, and similar venues. The invention further relates to a system and method to eliminate printed paper invoices and printed sales receipts by employing secure, validated transmission of a purely digital/electronic bill (e-bill) of sale and/or sales receipt.
Workflow Applications.
Workflow applications are software applications which automate some or all of the steps of a business process. Typically, most business processes have some steps which require human input, such as entry of specific text or selection of specific values. Other steps can be automated and so be handled by the software application. For example, once a user has selected a product, the software may automatically insert a price; or once a user has selected or typed in a purchaser, a previously recorded purchaser address may be automatically filled in.
Product ordering, finance, and billing/payment processes are typically good candidates for workflow applications. For example, a purchase order within a single company may require authorizations from multiple departments. Such a purchase order can be initiated by a requesting party within the company, and then be forwarded electronically, via software, through multiple departments within the company. The purchase order receives final authorization when all required departments have approved. If a workflow application is strictly internal to one company, the integrity and the validity of the data processed by the application can often be readily maintained by a single workflow application (or by multiple applications which are designed as an integrated “team” of software elements).
Sales and Billing
Small sales and service transactions are often done “in the field”, for example at a store where a product or service company (or a delivery service working on their behalf) delivers a product to a commercial enterprise or to a home. Such sales transactions typically involve hardcopy paperwork—for example, a bill of sale, a receipt, and/or electronic fund transfer authorizations (for example, credit cards or electronic checks). In such cases, both parties (seller and purchaser) typically need to exchange and retain valid copies of the pertinent documents.
These product/service sales and product/service billing situations provide another business context where workflow applications may be appropriate and helpful. A complication arises in such cases, however, because here the paperwork (which may be substantial in volume) is typically between different companies or organizations—typically between a purchaser and a seller. As a result, the paperwork—whether actual paper, or digital “paperwork”—is not internal to any one organization. Creating copies for each separate party, and establishing, maintaining, and validating the data integrity and security of paperwork exchanged between different companies, may then be a significant challenge.
Direct Store Delivery (DSD).
As noted above, paperwork exchanges may occur outside of an office context, adding a further challenge to maintaining data integrity and validation. Consider for example direct store delivery (DSD) applications. In deliveries from a distributor or product manufacturer directly to a store (or home consumer or other off-business-site customer), purchase orders, bills, and payments may all be exchanged at a consumer's front door, at a loading dock, or in the lobby/reception area of an office or factory.
In DSD applications, the DSD driver provides a bill (invoice) for the delivered material to the store keeper (or retailer, or home consumer). The bill amount may be paid immediately or at a later time. Store keepers (or retailers) may in turn provide receipts of some kind to the DSD driver. These exchanges of bills and receipts happen in the field, at the retailer's location.
In all these transactions there is a great deal of paperwork involved, and typically both parties exchange their bills physically. For customary record-keeping purposes, DSD suppliers (or DSD drivers) typically must maintain all these bills (invoices) and receipts for approximately ten to twenty years.
DSD as an Example of Workflow Applications:
In this document, many examples, applications, and exemplary methods may be based on exemplary Direct Store Delivery (DSD) use cases or DSD contexts. It will be noted that DSD examples and use cases are employed for purposes of illustration. The present system and method may be applied broadly to many different applications and contexts where transactional documents are employed, agreement documents are employed, or more generally to many types of workflow paperwork. Similarly, throughout this document, references to DSD persons or contexts, such as “DSD drivers”, may be understood to refer more broadly to any personnel engaged in workflow activities.
Advantages of Paper Bills and Receipts (Hardcopy)
Paper bills and receipts are still in use due to the following advantages:
Disadvantages of Paper Bills and Receipts (Hardcopy)
Paper bills, paper receipts, and other physical hardcopy have distinct, substantial disadvantages as well.
Electronic Scanning and Storage (Softcopy)
Pure usage of paper bills and paper receipts has the disadvantages noted immediately above. For this reason it has become common to have paper documents scanned and stored electronically. However, this solution presents problems as well.
Cost of scanners, printers, paper: In workflow applications, such as direct store delivery (DSD), both parties to the transaction (bill distributor and bill receiver) must have scanners and printers at the location where business is transacted, for example at individual stores. There is appreciable cost involved in the purchase of scanners, printers, toner, and the paper itself.
Carrying scanners, printers, and paper: In some workflow applications, the bill distributor (typically the seller or delivery service for a product or service) needs to bring along scanners, printers, and paper during business travel. Carrying and handling these items (along with the actual goods for sale) adds an extra burden for the delivery or service personnel.
Not environmentally friendly: Scans originate from paper. Using paper is not good for the environment, since paper is made from trees.
Tampering: Scanned copies, such as Portable Document Format (PDF) copies of printed bills and receipts, may be easily altered. For example, a scanned copy could be altered before transmission from one location to another, with the recipient having no way to know the PDF file was altered. In some embodiments, the present system and method addresses this deficiency.
Bill searching: Scanned copies are typically image copies, and cannot be readily searched for text or numeric data. Text recognition may be performed on such documents, but this requires extra processing. Further, handwritten information, as may be written on a print document by persons engaged in a sales transaction, often does not lend itself to reliable text recognition.
Transaction Validation Challenges
Sales documents, bills, and receipts are typically validated by the parties involved in the transactions. This conventionally involves signatures, which may be done pen-and-ink on paper and then scanned; increasingly, validation is also done by stylus signature on contact-sensitive electronic display screens or tablets. Even these approaches have disadvantages, especially for documents which must be transmitted from one party to another.
Signature validation: Signatures captured from paper by a scanner will not be the same as actual signatures (since they lack the indentations made in paper by an actual signature), and so may be less reliable for signature validation. Signatures which are originally captured electronically on tablets could easily be digitally “swapped” with false signatures by a malicious party.
Tampering: Softcopy, such as PDF scans of printed and hand-signed documents, can be tampered with and altered easily (with the possible exception of cases where digital signatures are used).
Digital Signatures and Digital Watermarks
Another method used for document validation is a digital signature, which is a mathematical process to demonstrate the authenticity of digital documents (for example, to authenticate PDF scans of print bills and receipts). Digital signatures can be validated mathematically by a recipient, which provides the recipient with a degree of confirmation that the message was created by a known sender (authentication), that the sender actually sent the message (non-repudiation), and that the message was not altered in transit (integrity).
Here again, however, there are disadvantages with digital signatures.
What is needed, then, is a system and method for recording financial transaction documents (such as sales bills and receipts) in the field at the point of sale/transaction, at the time of sale/transaction. The system and method should be entirely digital, avoiding paper altogether; and once the document is sent electronically, the system and method should provide for robust data validation at the receiving end. The system and method should also be practical, convenient and robust, employing digital validation processes which enable secure duplication and electronic transmission of the documents in a manner which is highly reliable, readily validated, and relies essentially only on data inherent in the transaction itself, without the need for third-party keys or validation algorithms (which can be modified for various reasons).
Accordingly, in one aspect, the present invention embraces portable hardware (tablet computers and/or cell phones, and similar) in conjunction with software, which together completely replace paper billing and receipts for DSD and for similar applications where paper or paperless documents are required to establish, confirm, or support some kind of mutual agreement between two parties. The mutual agreement may entail numeric data pertaining to finance or sales, but may also be any other kind of agreement between two parties which has unique document data.
The system and method allows for paperless billing and paperless receipt document exchanges, with inherent document verification, between any two parties (such as seller and purchaser) who are at the same location. The system and method enables the parties to exchange the transaction data through Bluetooth, USB, or any other wireless or wired communication. This system and method even works if the parties want to share data (that is, bills, receipts) by taking pictures of documents (for example, taking pictures via their cell phones).
In an embodiment, the system and method entails employing the unique data values which are inherent in a particular transaction document (such as data from a bill of sale) to create two visual representations such as two-dimensional (2-D) bar codes or matrix codes.
The document data is first encrypted to ciphertext according to a password which itself depends on the unique document data. A first 2-D barcode is then generated, which directly represents the document data, according to standard matrix coding algorithms. A second 2-D barcode represents the document data in the encrypted form. This creates a unique visual encoding/encryption specific to the particular transaction document.
Finally, the two 2-D barcodes are overlapped to form a single, color visual barcode-like representation, employing multiple colors for various combinations of overlapped cells, according to a cell mixing algorithm.
At a receiving end, the two original 2-D barcodes can only be separated by software which employs a proprietary, reverse algorithm to extract the two black-and-white 2-D barcodes from the combined color code. One of the two retrieved 2-D barcodes, which stored the document data in plaintext form, is then used to retrieve the original document data. The other retrieved matrix symbol contains the ciphertext document data.
The original document data is then employed to generate a second ciphertext 2-D barcode, according to the same algorithm employed above. If the two 2-D barcodes—the original and the newly generated—are a match, this indicates that data integrity has been maintained. If the two 2-D barcodes—original and newly generated—are not a match, this indicates that data integrity has been lost or tampered with.
In an embodiment, the system and method employs fingerprints from the transaction participants for document signatures or validation. The system and method entails employing the unique data values for the particular transaction document (such as data from a bill of sale) to generate a unique fingerprint shuffling sequence.
Each fingerprint, from each transaction participant (for example, seller and buyer), is broken into multiple image parts. For each fingerprint, these image parts are then spatially shuffled according to the unique fingerprint shuffling sequence.
Finally, the two shuffled fingerprint images are combined into a single, overlapping visual representation, employing multiple colors, according to a proprietary image overlap algorithm.
At a receiving end, the fingerprint shuffling sequence can be recovered from the original document data. At the receiving end, the two original shuffled fingerprint images can then be separated by software which employs a proprietary, reverse shuffling algorithm. This employs a reverse shuffling sequence to extract the two original black and white fingerprints from the combined, color shuffled fingerprint. The fingerprints can then be validated against separately stored records of fingerprints of the appropriate parties.
In the following description, certain specific details are set forth in order to provide a thorough understanding of various embodiments. However, one skilled in the art will understand that the invention may be practiced without these details. In other instances, well-known structures associated with computers, tablets, cell phones, or with other digital devices, and/or with data display, and/or with data storage or data transmission, have not been shown or are not described in detail to avoid unnecessarily obscuring descriptions of the embodiments.
Unless the context requires otherwise, throughout the specification and the claims which follow, the word “comprise” and variations thereof, such as, “comprises” and “comprising” are to be construed in an open sense, that is as “including, but not limited to.”
Reference throughout this specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. Thus, the appearances of the phrases “in one embodiment” or “in an embodiment” in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
The headings provided herein are for convenience only and do not interpret the scope or meaning of the claimed invention.
Reference Numbers: Reference numbers are used throughout the figures. The first digit of a reference number generally indicates the first drawing where the associated element appears. For example, an element with reference number “207” first appears in FIG. 2.
In some instances, an element may be shown in both a generic form and a more specific form or species; in these cases, the specific form or species may be indicated by an appended period (“.”) followed by a digit or digits to distinguish a species of the general form. For example, a general fingerprint image may have a reference number of 605; while a first fingerprint image of a first person may have a reference number 605.1, and a second fingerprint image of a second person may have a reference number of 605.2.
In this document, an exemplary sales invoice is used as an example to illustrate an exemplary transaction document. Persons skilled in the relevant arts will appreciate that the present system and method may be applied to any document which needs to be protected from tampering.
Portable Data Receiving and Display System (Tablet)
The present system and method may enable recording of data pertaining to workflow transactions, including financial and sales data such as sales transaction data, at a field location such as a retail store. Sales transactions are just one exemplary case of many possible applications of the present system and method. The present system and method is not limited to sales, and persons skilled in the relevant arts will appreciate that the present system and method can be employed with and applies to many kinds of workflow transactions.
The present system and method may employ a portable computing system or processing system, such as a tablet computer 100, to record, store, display, and transmit data. The data may include both numeric and text data, and also graphical or image data, such as an image of a human fingerprint captured via a biometric scanner. The biometric scanner may be an internal module, or may be an external (stand-alone) device which may be coupled to the tablet computer 100 via a wired or wireless link.
Other related technologies may be employed as well, including: a dedicated portable electronic system for sales, invoicing, and receipts; cellular phones; smart phones; personal digital assistants; vehicle-mounted computers; stationary (e.g., desktop) computers; or distributed computers (e.g., network systems).
Tablet 100 may employ static memory 109 for long-term storage of operating instructions, an operating system, and long-term data storage. Static memory 109 may include Read-Only Memory (ROM), non-volatile memory (NVRAM), a hard disk drive, flash drives and other removable-but-non-transitory storage, CD ROMs, PC-CARDs, memory cards, and/or other non-transitory storage media and code storage media as may be developed in the future.
Tablet 100 may also include one or more wireless communication systems 115. These wireless systems 115 may enable communication with other electronic devices and with computer networks via such protocols and networks as Bluetooth, WiFi, a local area network (LAN), an ad hoc network, a cellular network (e.g., a GSM network, a CDMA network, or an LTE network), and other protocols and wireless systems well known in the art or yet to be developed.
Tablet 100 may also include ports (not shown in FIG. 1) for wired communication connections 111, such as USB or Ethernet ports, for data exchange with other devices and networks.
Tablet 100 according to the present disclosure may include a display monitor 120 for display of user data and graphical images, and also for display of elements such as menus, dialog boxes, and display buttons which enable or facilitate user control of the present system and method.
Tablet 100 may include a keyboard 130 for entry of text and numeric data. In an alternative embodiment, display and keyboard functions may be integrated into a touch-screen (or pressure-sensitive) display 120 as is well known in the art.
Tablet 100 may include a mouse or touch-pad 140, also well-known in the art, for further control of a display pointer and other elements visible on monitor 120. In an alternative embodiment, mouse functions may be integrated into a touch-screen display 120 as well.
Tablet 100 may include audio user-interface elements 150 as well. These may include speakers, headphones, and/or a microphone. The microphone may be configured to receive verbal input from a user, and suitable software or firmware may translate the audio input into text and numeric data to be displayed on display 120, or may translate the audio input into control commands to operate tablet 100 according to the present system and method.
Other elements and means for user input and output, such as eye-glasses with optical tracking and interaction capabilities, or holographic display and interaction systems, may also be envisioned within the scope and spirit of the present system and method.
Tablet 100 may include a biometric sensor 160. Biometric sensor 160 may be used to identify a user or users of device 100 based on unique physiological features. In an embodiment, biometric sensor 160 may be a fingerprint sensor used to capture an image of a user's fingerprint. In an alternative embodiment, biometric sensor 160 may be an iris scanner or a retina scanner configured to capture an image of a user's iris or retina. As noted above, in an embodiment biometric sensor 160 may be an internal module of tablet 100. In an alternative embodiment, biometric sensor 160 may be an external (stand-alone) device which may be coupled to tablet computer 100 via a wired link (e.g., USB) or wireless link (e.g., Bluetooth connection).
In an embodiment of the present system and method, tablet 100 may include a camera 170 for capture of images. In an embodiment, camera 170 may be used to capture images of documents, such as sales transaction documents, which may be displayed on paper or by other electronic devices. In an embodiment, camera 170 may serve multiple roles, including serving effectively as biometric sensor 160 for the capture of user fingerprints, iris scans, retinal scans, or even facial images of parties to a sales transaction.
The processor 105 is communicatively coupled via a system bus 195 to memory 107, NVRAM 109, wired connections 111, wireless transceivers 115, monitor 120, keyboard 130, mouse 140, audio I/O 150, biometric sensor 160, camera 170, and to such other hardware devices as may be necessary or helpful to implement the present system and method. Tablet 100 includes suitable electronic support for bus 195 to mediate interactions between all elements of device 100.
Typically, processor 105 is configured to execute instructions and to carry out operations associated with the tablet 100. For example, using instructions retrieved from the memory 107 (e.g., a memory block) and/or static memory 109, processor 105 may control the reception and manipulation of input and output data between internal components of the tablet 100. Processor 105 typically operates with an operating system to execute computer code and to produce useful data. The operating system, other computer code, and data may reside within memory 107 and/or memory 109 that is operatively coupled to processor 105 via bus 195.
The operating system, other computer code, and data may also be hard-coded into tablet 100 either as dedicated logic within processor 105 or as non-volatile memory known as firmware 109.
In an embodiment, the instructions and data employed by the electronic device may be organized into one or more software modules, firmware modules, drivers, or other programs. Such modules may be implemented, in whole or in part, as one or more of: dedicated logic in processor 105; firmware 109; and dedicated, specialized processors (not shown in FIG. 1).
Exemplary modules which may be employed may include:
It will be understood by persons skilled in the relevant arts that the above indicated modules may employ any of several algorithms and methods well known in the art, or may employ new or novel methods yet to be developed.
Transaction Documents (Bills, Receipts, Contracts)
The present system and method is explained herein largely with reference to an exemplary sales transaction and in particular an exemplary sales invoice 200. This is to be understood as being by way of example and illustration only, and should not be construed as limiting. The present system and method is applicable to many different types of documents and records which record or in other ways pertain to or may be associated with mutual agreements between two parties, including for example and without limitation: contracts, bids, licensing agreements, proposals, technical specifications, requirements documents, service agreements, parts lists, deeds, liens, letters of understanding, title documents, receipts, easements, and covenants.
Template 200 may include various display buttons, menus, and other user-interface elements, not shown in FIG. 2.
In practical, real-world application, each transaction bill (invoice) or receipt will likely have some kind of unique and critical information. Normally, each company maintains predefined bill and receipt templates 200 (or bill and receipt formats 200) so that new transactions can be defined in the field, as needed.
Multiple unique transaction values 240 may be combined into a single data set which is itself unique. Further, through various algorithms, a unique data set may be employed to generate a unique encryption key or password (see FIG. 3, discussed below).
Examples of transaction data 240 which are typically essential to each transaction, and which are unique or distinctive may include, for example without limitation:
Persons skilled in the art will recognize that some such data may not be universally unique, but will have widely varying values across many transactions. For example, a Total Amount Due 240.3 may occasionally have the same exact value in two completely separate transactions, but will generally vary greatly from one transaction to another, and may have a very large range of possible values.
Other invoice values may be common between multiple transactions, but may be unique between different vendors or customers. They may be referred to as “semi-unique transaction values” or “distinctive transaction values”. These collectively may still be used as elements of a combined data set which is, in totality, unique among all commercial transactions. Additional invoice values which may be employed within the scope of the present system and method include, for example without limitation:
The above identified transaction values 240 are exemplary only. Other billing, receipt, and transaction values 240 may be employed as well on transaction template 200. These exemplary values 240, and similar values, are the parameters identified in transaction template 200. These parameters 240 may be employed for generating passwords and other required elements of the present system and method, as discussed further below.
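By way of illustration only, the following Java sketch shows one way in which several transaction values 240 might be combined into a single unique data set and reduced to a document-specific password 360. The class and field names, and the use of a SHA-256 hash, are illustrative assumptions; the actual password generating algorithm 350.1 (discussed below) may differ and may be proprietary.

    import java.math.BigInteger;
    import java.security.MessageDigest;

    public class PasswordFromTransaction {
        // Combine unique/distinctive transaction values 240 into one data set,
        // then hash the combined data set to obtain a document-unique password 360.
        public static String derivePassword(String invoiceNumber, String invoiceDate,
                String customerId, String totalAmountDue) throws Exception {
            String dataSet = invoiceNumber + "|" + invoiceDate + "|"
                    + customerId + "|" + totalAmountDue;
            byte[] digest = MessageDigest.getInstance("SHA-256")
                    .digest(dataSet.getBytes("UTF-8"));
            return new BigInteger(1, digest).toString(16); // hex-encoded password
        }
    }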
Generation of a Document-Unique Password and Shuffling Sequence
In step 305 of method 300, designated critical transaction data values 240 are extracted from a sales invoice, bill, receipt, or similar transaction document 200. Exemplary transaction values 240 were discussed above in conjunction with FIG. 2.
In step 310 of method 300, the extracted data values 240 are provided as input parameters 240 to suitable algorithms 350 for key generation. Key generation algorithms 350 may include, for example and without limitation a password generating algorithm 350.1 or an image shuffling sequence algorithm 350.2. Other algorithms may be employed as well.
In an embodiment, the present system and method employs one or more unique (that is, proprietary, not publicly known, not commercially available, not open-source, etc.) key generation algorithms 350 for enhanced data security. General design methods and principles for the development of such algorithms are known in the art, and so specific, possibly proprietary algorithms 350 are not discussed here.
In an alternative embodiment, the present system and method employs one or more publicly known, commercially available, or open-source key generation algorithms 350.
In an alternative embodiment, multiple key generation algorithms 350 may be employed (the same algorithm applied repeatedly, or different algorithms), applied for example consecutively, and employing either generally known algorithms or proprietary algorithms, or both.
In step 315 of method 300, the algorithm(s) 350 output one or more unique keys or key sequences 360, 365 which are to be employed for data encryption, as described further below in this document. Such keys may include, for example and without limitation:
In an embodiment of the present system and method, additional or alternative sequences or patterns may be generated as well from the unique document data. For example, in addition to shuffling sub-parts of a biometric image, parts or segments of a biometric image may be flipped (horizontally and/or vertically) as well, or rotated; the present system and method may generate suitable flip or rotation sequences, or other image alteration patterns, as required, from the unique document data.
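By way of illustration only, the following Java sketch shows one way a document-unique fingerprint shuffling sequence 365 might be derived from the document-specific password 360 generated above. Seeding java.util.Random with the password's hash code is an illustrative assumption; the actual image shuffling sequence algorithm 350.2 may differ and may be proprietary.

    import java.util.ArrayList;
    import java.util.Collections;
    import java.util.List;
    import java.util.Random;

    public class ShuffleSequence {
        // Produce a reproducible shuffling of the digits 1..parts, seeded by
        // document data, so any party holding the document can regenerate it.
        public static List<Integer> sequenceFor(String password360, int parts) {
            List<Integer> order = new ArrayList<>();
            for (int i = 1; i <= parts; i++) order.add(i); // 1, 2, ..., parts
            Collections.shuffle(order, new Random(password360.hashCode()));
            return order; // e.g., [7, 3, 16, 1, ...] for parts = 16
        }
    }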
Generation of 2-D barcodes for Transaction Data (Plaintext and Ciphertext)
In an embodiment, the present system and method generates two visual representations of transaction data 240. The first geometric representation 410.1 is of the data 240 in its native (unencrypted, or plaintext) form, while the second geometric representation 410.2 is of the data 240 in an encrypted form (also referred to as “ciphertext” 425; see FIG. 4).
In an embodiment, the geometric representation 410 is in the form of a two-dimensional barcode. In an embodiment, the geometric representation 410 is in two colors only, or “monochrome”, the two colors often being (as is conventional in the art) black and white. More generally, in an embodiment, the geometric representation 410 is in two colors only, or “monochrome”, one being a first high intensity color and the second being a second low intensity color, so as to provide a strong intensity contrast for clarity of scanning.
In an embodiment of the present system and method, the 2-D barcodes 410 may be QR codes 410. In alternative embodiments the geometric representations may take other forms well known in the art, such as Aztec codes, Data Matrix, MaxiCode, PDF 417, stacked (multi-row) barcodes, or other such two-dimensional geometric codes known or yet to be developed.
In this document, the terms “barcode” 410, “two dimensional barcode” 410, “2-D barcode” 410, “geometric data representation” 410, “matrix code” 410, and “QR code” 410 are used interchangeably to reflect geometric representations of alphanumeric data 240 which are typically two-dimensional, though in some embodiments may be strictly linear (one dimensional).
2-D barcode for Plaintext (Unencrypted) Data:
In an embodiment, the transaction data 240 is processed according to known algorithms 405 for generating 2-D bar codes 410, such as a QR code or other matrix code, stacked barcode, etc. In an alternative embodiment, a proprietary conversion algorithm 405 may be employed to generate a proprietary 2-D barcode 410 (see FIG. 4).
The result is a first 2-D barcode 410.1, which encodes the plaintext transaction data 240, that is, the transaction data without encryption. Persons skilled in the relevant arts will recognize that the transaction data 240 may be readily recovered from 2-D barcode 410.1 through decoding algorithms generally known in the art, or through a proprietary decoding algorithm for a proprietary 2-D barcode.
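By way of illustration only, the following Java sketch generates such a plaintext 2-D barcode 410.1. The use of the open-source ZXing library is an illustrative assumption; any standard or proprietary matrix-code encoder 405 may be substituted.

    import com.google.zxing.BarcodeFormat;
    import com.google.zxing.common.BitMatrix;
    import com.google.zxing.qrcode.QRCodeWriter;

    public class PlaintextBarcode {
        // Encode the plaintext transaction data 240 as a QR code 410.1;
        // the 300 x 300 output size is arbitrary and illustrative.
        public static BitMatrix encode(String transactionData240) throws Exception {
            return new QRCodeWriter().encode(transactionData240,
                    BarcodeFormat.QR_CODE, 300, 300);
        }
    }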
2-D Barcode for Ciphertext (Encrypted) Data:
In an embodiment, the unencrypted transaction data 240 (generically referred to in the art as “plaintext”) is encrypted using an encryption algorithm 420 or algorithms 420. In an embodiment, the encryption algorithm 420 will employ, as a key or password, the encryption password 360 previously calculated from the transaction data 240 (see FIG. 3).
The output result is encrypted transaction data 425 (generically referred to in the art as “ciphertext”), which may be ciphertext invoice data 425, ciphertext receipt data 425, or other ciphertext transaction data 425. In an embodiment, the ciphertext data 425 is generally not understandable or meaningful as presented; and in a symmetric (“private key”) embodiment, it can generally only be read if first decrypted employing the same encryption password 360.
In an embodiment, the encryption algorithm 420 may be any of several known algorithms, including for example and without limitation: Pretty Good Privacy (PGP), Triple DES, RSA, Blowfish, Twofish, and AES. In an alternative embodiment of the present system and method, a proprietary encryption algorithm 420 may be employed.
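By way of illustration only, the following Java sketch encrypts transaction data 240 to ciphertext 425 using AES from the standard javax.crypto package, keyed by the document-specific password 360. Deriving both the AES key and the initialization vector directly from a hash of the password is a simplification for illustration; a production embodiment may derive these values differently.

    import java.security.MessageDigest;
    import java.util.Arrays;
    import javax.crypto.Cipher;
    import javax.crypto.spec.IvParameterSpec;
    import javax.crypto.spec.SecretKeySpec;

    public class TransactionEncryptor {
        public static byte[] encrypt(String transactionData240, String password360)
                throws Exception {
            byte[] hash = MessageDigest.getInstance("SHA-256")
                    .digest(password360.getBytes("UTF-8"));
            SecretKeySpec key = new SecretKeySpec(hash, "AES"); // 256-bit AES key
            IvParameterSpec iv = new IvParameterSpec(Arrays.copyOf(hash, 16));
            Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
            cipher.init(Cipher.ENCRYPT_MODE, key, iv);
            return cipher.doFinal(transactionData240.getBytes("UTF-8")); // ciphertext 425
        }
    }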
In an alternative embodiment of the present system and method, additional encryption passwords 460 may be employed in addition to the encryption password 360 calculated from transaction data 240. In an embodiment, one or more additional encryption passwords 460 may be calculated based on the same unique transaction data, but employing different password generating algorithms 350.1. Other sources of additional encryption passwords 460 may be envisioned as well, including for example and without limitation a password which has been privately shared (via other systems and methods, such as e-mail or secure network connection) between the two parties to a sales transaction.
The ciphertext transaction data 425 is then processed according to known algorithms 405 for generating a 2-D barcode 410.2, such as a QR code. In an alternative embodiment, a proprietary conversion algorithm 405 may be employed to generate a proprietary 2-D barcode 410.2. In
The result is a second 2-D barcode 410.2, which encodes the ciphertext transaction data 425.
Merging/Combining the Two 2-D Barcodes into One Image
In an embodiment of the present system and method, the two 2-D barcodes generated above—2-D barcode 410.1 representing the plaintext transaction data, and 2-D barcode 410.2 representing the encrypted transaction data (ciphertext)—are merged into a single combined 2-D barcode image 510, referred to equivalently and interchangeably herein as the “merged 2-D barcode image” 510, “merged image” 510, “merged matrix code image” 510, “multicolored matrix code” 510, and similar terms.
The combined 2-D barcode image 510, or merged image 510, is generated in such a way that a third-party who is viewing or parsing the merged image 510 would generally not be able to extract any data from the image 510. This is because a third-party would require access to a proprietary algorithm to reconstruct the two original 2-D barcodes 410. At the same time, with access to the appropriate algorithm, the two initial 2-D barcodes 410 can later be recovered from the merged image 510.
In an embodiment, plaintext 2-D barcode 410.1 and ciphertext 2-D barcode 410.2 are graphically merged by combining each corresponding cell from the two source 2-D barcodes 410, and generating a designated color cell 515 for different combinations of source cells. “Corresponding cells” are cells which share a same coordinate position; for example, the cell in row X, column Y of plaintext 2-D barcode 410.1 and the cell in row X, column Y of ciphertext 2-D barcode 410.2 are corresponding cells.
Understood another way, plaintext 2-D barcode 410.1 may be overlapped directly on top of ciphertext code 410.2 (or vice versa). The overlap of any two source cells generates a cell combination 515 which may be assigned a specific color.
In an exemplary embodiment illustrated in FIG. 5, each of the four possible combinations of overlapped source cells 515 (black over black, black over white, white over black, and white over white) is mapped to a designated output color.
The above combinations, as illustrated in FIG. 5, are exemplary only; other mappings of cell combinations 515 to output colors may be employed as well.
In an alternative embodiment, a designated cell combination could be randomly mapped to any of several designated color options for merged cells. For example, a merging 515.3 of a white cell over a black cell could be mapped randomly to any of white, red, or brown. For another example, a merging 515.4 of a black cell over a white cell could be mapped randomly to any of blue, orange, or violet. For third-parties who do not know the proprietary mapping algorithm, the use of additional possible output colors may further confound any efforts to “unmerge” or separate the merged image 510 into the original two 2-D barcodes 410.
The output result of the process is the merged 2-D barcode 510 (which in FIG. 5 is depicted as a multicolored matrix code 510).
Sizes (Cell Dimensions) of the 2-D Barcodes:
It will be noted that the method 500 may be applied to two 2-D bar codes 410 which are of equal size (that is, equal horizontal numbers of cells and equal vertical numbers of cells). The method 500 may also be applied to a first 2-D barcode 410 of a first size and a second 2-D barcode 410 of a different second size, as illustrated in FIG. 5.
Cell-by-Cell and Pixel-by-Pixel:
It will be noted that 2-D barcode merging has been described above as being implemented on a cell-by-cell basis. Persons skilled in the relevant arts will appreciate that, in an alternative embodiment, the color combination algorithms of the present system and method may be implemented instead on an image-pixel by image-pixel basis.
Exemplary Code:
Presented here is exemplary code (which may for example be in C, C++, Java, or a similar programming language, or may be pseudocode which can be implemented in specific form in any number of known programming languages) of a kind which may be employed to merge plaintext 2-D barcode 410.1 and ciphertext 2-D barcode 410.2 into a merged, 2-D barcode image 510:
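One non-limiting Java sketch of such a merge follows; the class name, the boolean cell arrays (true for black, false for white), and the specific output colors are illustrative assumptions only.

    import java.awt.Color;
    import java.awt.image.BufferedImage;

    public class BarcodeMerger {
        // Merge plaintext barcode 410.1 and ciphertext barcode 410.2 cell by cell,
        // mapping each overlapped cell combination 515 to a designated color.
        public static BufferedImage merge(boolean[][] cells1, boolean[][] cells2) {
            int rows = cells1.length, cols = cells1[0].length;
            BufferedImage merged = new BufferedImage(cols, rows, BufferedImage.TYPE_INT_RGB);
            for (int m = 0; m < rows; m++) {
                for (int n = 0; n < cols; n++) {
                    Color c;
                    if (cells1[m][n] && cells2[m][n])        c = Color.BLACK; // black over black
                    else if (!cells1[m][n] && !cells2[m][n]) c = Color.WHITE; // white over white
                    else if (!cells1[m][n])                  c = Color.RED;   // white over black
                    else                                     c = Color.BLUE;  // black over white
                    merged.setRGB(n, m, c.getRGB());
                }
            }
            return merged;
        }
    }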
Persons skilled in the relevant arts will recognize that the above code sample is exemplary only, and other code and other color combinations may be employed within the scope and spirit of the present system and method.
The resulting, merged, 2-D barcode 510 encodes, in full, both the original plaintext transaction data 240 and the encrypted transaction data 425. Merged image 510 can be transmitted to either party to the transaction (for example seller or purchaser) as a graphical image. As will be discussed further below, merged image 510, and the transaction data which it contains, can be readily validated at the receiving end to establish whether the received data is valid or if instead the received data has been corrupted in any way.
Paper-Free Bill or “e-Bill”:
In an embodiment of the present system and method, the merged image 510 may constitute a final, complete paper-free bill, which encodes all transaction data 240 in an image. In an alternative embodiment (discussed further below; see FIG. 8), the merged image 510 may be concatenated with biometric signature images to form a final transaction document image 805.
Transaction Affirmation Via Digitized Biometric Signatures
Conventionally, business transactions such as sales transactions are affirmed, witnessed, or validated by one or both parties to the transaction. For printed documents, a commonly used form of affirmation is the written signature of one or both parties.
In embodiments of the present system, the electronic transactions document(s) created, recorded, and transmitted may be affirmed by biometric signatures 605 which are graphical (that is, image-oriented) in nature. In various embodiments, possible biometric signatures 605 may include, for example, and without limitation: fingerprint images, palm-print images, iris scans, retinal scans, and even facial photos.
In an embodiment of the present system and method, the biometric signature 605, such as a fingerprint 605, is in two colors only, or “monochrome”, the two colors often being (as is conventional in the art) black and white, or two other colors with strong relative contrast. In an embodiment, the biometric signatures 605, such as a fingerprint 605 or handprint, or even an iris print, retinal scan, or facial photo, may originally be captured in grayscale or multiple colors. The grayscale or color image may be reduced to a two-color monochrome, such as black and white pixels, via known image processing methods, while retaining image quality and retaining essential image features and image data for purposes of biometric identification of persons.
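By way of illustration only, the following Java sketch reduces a grayscale capture to a two-color monochrome image via simple thresholding; the fixed threshold of 128 is an illustrative assumption, and production embodiments may employ adaptive thresholding or other known image processing methods.

    import java.awt.image.BufferedImage;

    public class Monochrome {
        // Map each pixel to pure black or pure white by its gray level.
        public static BufferedImage toBlackAndWhite(BufferedImage gray) {
            BufferedImage bw = new BufferedImage(gray.getWidth(), gray.getHeight(),
                    BufferedImage.TYPE_INT_RGB);
            for (int y = 0; y < gray.getHeight(); y++) {
                for (int x = 0; x < gray.getWidth(); x++) {
                    int level = gray.getRGB(x, y) & 0xFF; // low byte ~ gray level
                    bw.setRGB(x, y, (level < 128) ? 0x000000 : 0xFFFFFF);
                }
            }
            return bw;
        }
    }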
In an embodiment of the present system and method, a biometric signature of both parties to the transaction is captured via an imaging device or scanner, such as biometric sensor 160 or camera 170 of tablet device 100. The biometric images 605 can be concatenated to the merged 2-D barcode image 510 with the document data, as described further below.
In an embodiment of the present system and method, the graphical biometric signatures 605 may be scrambled or encoded in a variety of ways prior to concatenation and digital transmission. The scrambling or encoding helps ensure that, if the digital transaction document (e-bill) is improperly intercepted or otherwise obtained by third-parties (not parties to the transaction), the biometric signatures 605 cannot be recognized or interpreted. In this way, only legitimate signatory parties to the financial transaction can read and validate the biometric signatures 605.
Fingerprint Shuffling:
Once captured, fingerprint 605 is spatially divided into subparts 607 via image processing software running on processor 105 or other firmware of tablet 100. In the exemplary embodiment shown, fingerprint 605 is spatially divided into sixteen (16) separate sub-parts 607 of equal width/height dimensions; other numbers of sub-parts 607, greater or lesser, may be employed as well.
As discussed above (see FIG. 3), a unique fingerprint shuffling sequence 365 may be generated from the unique transaction data 240 of the particular transaction document.
The fingerprint shuffling sequence 365 maps the original image from its original spatial ordering to a new spatial ordering. In an exemplary embodiment, the subparts 607 may be numbered sequentially, starting with number ‘1’ in an upper-left corner, and incrementing by one (1) reading from left-to-right and top-row to bottom-row as shown. Each sub-part 607 is then mapped from its original location to the former location of another sub-part 607. The result is a shuffled fingerprint image 615.
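By way of illustration only, the following Java sketch divides a fingerprint image 605 into a 4 x 4 grid of sub-parts 607 and rearranges them according to a shuffling sequence 365 (represented as a list in which entry i names the source sub-part moved into slot i). The grid size, and the assumption that the image dimensions divide evenly by four, are illustrative.

    import java.awt.image.BufferedImage;
    import java.util.List;

    public class FingerprintShuffler {
        public static BufferedImage shuffle(BufferedImage fingerprint605,
                List<Integer> sequence365) {
            int grid = 4; // 4 x 4 = 16 sub-parts 607
            int w = fingerprint605.getWidth() / grid, h = fingerprint605.getHeight() / grid;
            BufferedImage shuffled615 = new BufferedImage(w * grid, h * grid,
                    BufferedImage.TYPE_INT_RGB);
            for (int slot = 0; slot < grid * grid; slot++) {
                int src = sequence365.get(slot) - 1; // sequence numbering starts at 1
                BufferedImage part607 = fingerprint605.getSubimage(
                        (src % grid) * w, (src / grid) * h, w, h);
                // Draw source sub-part 'src' into destination position 'slot'.
                shuffled615.getGraphics().drawImage(part607,
                        (slot % grid) * w, (slot / grid) * h, null);
            }
            return shuffled615;
        }
    }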
Two Shuffled Fingerprints:
With reference now to FIG. 6, the present system and method may capture and shuffle fingerprint images from both parties to a transaction.
In an embodiment of the present system and method, two fingerprint images 605.1, 605.2 are each shuffled into two respective shuffled fingerprint images 615.1, 615.2. In an embodiment of the present system and method, both of fingerprint images 605.1, 605.2 are shuffled according to a common or same fingerprint shuffling sequence 365. In an alternative embodiment, the present system and method may generate two different fingerprint shuffling sequences 365.1, 365.2; and the present system and method may then shuffle each of fingerprint images 605.1, 605.2 according to the respective first and second fingerprint shuffling sequences 365.1, 365.2.
Merging/Combining the Two Shuffled Fingerprints into One Image:
In an embodiment of the present system and method—and similar to the manner in which two black-and-white 2-D barcodes 410 may be merged into a single, multi-colored 2-D barcode 510 (see FIG. 5)—the two shuffled fingerprints 615.1, 615.2 may be merged into a single combined image, referred to herein as the merged fingerprint image 712 or merged fingerprint 712.
The merged fingerprint image 712 is generated in such a way that a third-party who is viewing or parsing the merged fingerprint 712 would generally not be able to extract the two original shuffled fingerprints 615.1, 615.2. This is because a third-party would require access to a proprietary algorithm to reconstruct the two original shuffled fingerprints 615.1, 615.2. At the same time, with access to the appropriate algorithm, the two initial shuffled fingerprints 615.1, 615.2 can later be recovered from the merged fingerprint image 712.
In an embodiment, shuffled fingerprint 615.1 and shuffled fingerprint 615.2 are graphically merged by combining each corresponding pixel from the two source shuffled fingerprints 615, and generating a designated color pixel 715 for different combinations of source pixels. “Corresponding pixels” are pixels which share a same coordinate position, that is, both pixels are in row X, column Y of respective shuffled fingerprints 615.1 and 615.2. This merging process is illustrated diagrammatically in FIG. 7.
Understood another way, shuffled fingerprint 615.1 may be overlapped directly on top of shuffled fingerprint 615.2 (or vice versa). The overlap of any two source pixels generates a merged pixel combination 715 which may be assigned a specific color.
In an exemplary embodiment illustrated in FIG. 7, each of the four possible combinations of overlapped source pixels 715 (black over black, black over white, white over black, and white over white) is mapped to a designated output color.
The above combinations, as illustrated in FIG. 7, are exemplary only; other mappings of pixel combinations 715 to output colors may be employed as well.
In an alternative embodiment, a designated pixel combination could be randomly mapped to any of several colors of merged pixels. For example, a merging 715.3 of a white pixel over a black pixel could be mapped randomly to any of Red, Yellow, or Brown. For another example, a merging 715.4 of a black pixel over a white pixel could be mapped randomly to any of Blue, Orange, or Violet. For another example, two overlapped white pixels or two overlapped black pixels could be mapped to colors other than black or white. For third-parties who do not know the proprietary mapping algorithm, the use of additional possible output colors may further confound any efforts to “unmerge” or separate the merged image 712 into the original two shuffled fingerprints 615.1, 615.2.
Code (which may for example be in C, C++, Java, or a similar programming language, or may be pseudocode which can be implemented in specific form in any number of known programming languages) may be employed to merge shuffled fingerprints 615.1, 615.2 into a merged, multi-colored shuffled fingerprint 712. Such code may be the same or substantially similar to that discussed above in conjunction with FIG. 5, operating pixel-by-pixel rather than cell-by-cell.
The resulting merged, multi-colored shuffled fingerprint image 712 combines in full both of the original shuffled fingerprints 615.1, 615.2. Merged image 712 can be transmitted to either party to the transaction (for example seller or purchaser) as a graphical image. As will be discussed further below, merged image 712 can be readily validated at the receiving end to confirm that the fingerprints 605.1, 605.2, which were generally obtained at the point-of-transaction, match expected fingerprints (such as fingerprints stored in a suitable database [such as an employee database, customer database, etc.] or fingerprints obtained live from DSD personnel, in real-time, at the place and time of fingerprint validation).
Final Transaction Document Image
In an embodiment of the present system and method, a final transaction document image 805 is formed by combining or concatenating the merged 2-D barcode image 510 (see FIG. 5) and the merged shuffled fingerprint image 712 (see FIG. 7).
In an embodiment, the final transaction document image 805 is composed by spatially placing the two source images (merged 2-D barcode image 510 and merged shuffled fingerprint image 712) side-by-side, or one spatially above or spatially below the other; or, put another way, by placing the two source images 510, 712 spatially adjacent to each other. In an embodiment, a border or borders 807 of a designated color or shading may be placed around either or both of merged 2-D barcode image 510 and merged shuffled fingerprint image 712. Border 807 may help distinguish the two separate images within the overall final transaction document 805. Border 807 may also help ensure that the final transaction document 805 has a conventional rectangular image shape.
In alternative embodiments, other two-dimensional spatial arrangements may be made between merged 2-D barcode image 510 and merged shuffled fingerprint image 712 to compose the overall final transaction document image 805. In an embodiment, the sizes (pixel dimensions) of merged 2-D barcode image 510 and shuffled fingerprint image 712 can be adjusted in final transaction document image 805 for easy transfer and decoding.
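By way of illustration only, the following Java sketch composes a final transaction document image 805 by placing the two source images side by side within a border 807; the border width and color are illustrative assumptions.

    import java.awt.Color;
    import java.awt.Graphics2D;
    import java.awt.image.BufferedImage;

    public class FinalDocumentComposer {
        public static BufferedImage compose(BufferedImage barcode510,
                BufferedImage fingerprint712) {
            int border807 = 10; // illustrative border width, in pixels
            int h = Math.max(barcode510.getHeight(), fingerprint712.getHeight())
                    + 2 * border807;
            int w = barcode510.getWidth() + fingerprint712.getWidth() + 3 * border807;
            BufferedImage ftd805 = new BufferedImage(w, h, BufferedImage.TYPE_INT_RGB);
            Graphics2D g = ftd805.createGraphics();
            g.setColor(Color.LIGHT_GRAY); // border/background color 807
            g.fillRect(0, 0, w, h);
            g.drawImage(barcode510, border807, border807, null); // left: merged barcode 510
            g.drawImage(fingerprint712,
                    barcode510.getWidth() + 2 * border807, border807, null); // right: 712
            g.dispose();
            return ftd805;
        }
    }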
In an alternative embodiment (not illustrated), additional image modifications may be made to final transaction document image 805. These may entail for example applying one or more data-lossless image deformations, such as skewing, rippling, or twirling, to FTD image 805. At a receiving end, and for purposes of data recovery, the present system and method would then entail first applying a suitable reverse deformation (deskew, unripple, untwirl, etc.), with suitable inverse deformation parameters, to the received FTD image 805.
In an embodiment, within final transaction document image 805, each and both of merged 2-D barcode image 510 and merged shuffled fingerprint image 712 are so arranged that the two source images 510, 712, when concatenated, may still be readily distinguished from each other; and equally, so that each source image 510, 712 is not obscured by the other. This ensures that the data contained in each of merged 2-D barcode image 510 and merged shuffled fingerprint image 712 is not obscured by the other image. It further ensures that, on the receiving end, image processing software can readily extract and separate both of merged 2-D barcode image 510 and merged shuffled fingerprint image 712.
In an alternative embodiment, within final transaction document image 805, each and both of merged 2-D barcode image 510 and merged shuffled fingerprint image 712 may be so arranged that the two source images 510, 712 partially or wholly overlap, with for example still further, additional pixel color mappings for overlapped pixels. In such alternative embodiments, additional algorithms on the receiving end would separate the two source images 510, 712, according to suitable algorithms to distinguish overlapping pixels.
Paper-Free Bill or “e-Bill”:
In an embodiment of the present system and method, the final transaction document image 805, which encodes all transaction data 240 in an image and also includes a biometric signature image 712, constitutes a “paper-free bill”. This may also be referred to equivalently as an “e-bill”, or similarly a “paper-free-” or “e-” invoice, receipt, transaction document, etc., for example a paper-free invoice, an e-invoice, an e-receipt, or by similar transaction document terms as applicable.
Image Data Transfer: Local Transfer, and Transfer from Point-of-Transaction to Remote Offices
It will be noted that, at the point-of-sale, where final transaction document (FTD) image 805 is created, the FTD image 805 may be transferred between transaction parties via local wired or wireless communications. For example, if FTD image 805 is created on tablet 100 (for example, by the seller), FTD image 805 may then be transferred to the buyer's tablet or to a cellular phone via a wireless connection such as Bluetooth or a local WiFi network. FTD image 805 may also be transferred from the seller to the buyer's cellular phone or tablet via a wired connection such as USB or Ethernet.
With reference now to FIG. 9, the final transaction document (FTD) image 805 may also be transmitted from the point-of-transaction to remote offices of either or both parties to the transaction.
Direct Transmission:
In an embodiment, FTD image 805 may be transmitted by a first transmission process 915.1, for example via conventional e-mail, via FTP (file transfer protocol), cellular network transmission, or similar, from tablet 100 to the receiving parties. FTD image 805 may be sent in any known conventional document format, such as JPG, PNG, TIFF, BMP, GIF, and others well known in the art, provided the image format permits sufficiently clear image resolution and detail for data recovery at the receiving end.
The received FTD image 940 is obtained by the receiving party. The received image 940 is processed via a decode and validation process 950 (discussed further below) to yield the original transaction document values 240 and fingerprints 605.
Indirect Transmission Via Image Capture:
In an alternative embodiment of the present system and method, FTD image 805 may first be captured in the field by another imaging device, such as a cell phone 905. For example, one of the parties to the point-of-service transaction may employ a cell phone 905 to capture the image of FTD image 805 as shown directly on a display screen 120 of tablet 100. Because cell phone imaging is subject to manual uncertainties (such as imperfect orientation of the cell phone 905 by the user, or hand motion of the cell phone 905 by the user), the captured image 910 may be subject to imperfections. These imperfections may include skewing of the image, partial rotation of the image, slight blurring, or other imperfections.
The captured FTD image 910 may be transmitted by a second transmission process 915.2 (for example, via the cellular network methods employed by a suitable cell phone service provider) from cell phone 905 to the receiving parties. The captured FTD image 910 may be sent in any known conventional document format, such as JPG, PNG, TIFF, BMP, GIF, and others well known in the art.
The received, captured FTD image 945 is obtained by the receiving party, and will include any imperfections caused by the image capture process on cell phone 905.
In an embodiment of the present system and method, on the receiving end, the received, captured FTD image 945 may be subject to image-correction processes 947, such as deskewing, rotation, image sharpening, and other appropriate image-processing methods generally known in the art. The result is a corrected received image 940 which is the same, or substantially the same, as the final transaction document image 805.
The corrected received image 940 is further processed via a decode and validation process 950 (discussed further below) to yield the original transaction document values 240 and fingerprints 605.
Method for Transaction Document Encoding into Verifiable Image Format
The exemplary method 1000 may entail some method steps which may be the same or similar to many method steps previously discussed above in this document (see FIGS. 3 through 8).
The exemplary method 1000 may be performed via an electronic processing device, such as tablet 100, which is used to obtain, store, and process commercial transaction data, such as a sale and payment between two parties. The method 1000 may be performed after transaction data has been entered into, and stored on, tablet 100. The method 1000 may also be performed by a secondary, associated electronic processing device (which also has a processor, memory, and other elements similar to tablet 100), and which is communicatively coupled to tablet 100 to obtain data from tablet 100. For convenience of exposition only, the discussion below assumes method 1000 is performed via tablet 100.
Persons skilled in the relevant arts will recognize that the order of steps in method 1000 is exemplary only, and some steps shown later in method 1000 may be performed before steps which are described herein as earlier in the method 1000.
The method 1000 begins with step 1005. In step 1005, method 1000 extracts unique transaction data 240 and other distinctive transaction data 240 from a transaction document 200. (See FIG. 2.)
In step 1010, method 1000 generates a first two-dimensional (2-D) bar code 410.1, such as a matrix code or QR code. The encoding process 405 writes the plaintext transaction data 240 into a graphical form 410.1, typically a matrix or other arrangement of black-and-white cells or bars. (See FIG. 4.)
In step 1015, method 1000 generates a document-specific encryption password 360 which is based on, and is generally unique to, the unique transaction data 240 from document 200. The document-specific encryption password 360 (also referred to herein simply as “password 360”) may be generated according to any number of password generation methods 350 known in the art, or according to a proprietary algorithm 350, as long as the algorithm 350 generates the password 360 based upon and unique to the unique transaction data 240. (See FIG. 3.)
In step 1020, method 1000 generates encrypted transaction data 425. Encrypted transaction data 425 is generated via a specified data encryption algorithm 420, and encrypts transaction data 240 by employing the document-specific password 360 discussed above (possibly along with additional passwords). The output of the step is encrypted, or “ciphertext”, transaction data 425. (See FIG. 4.)
In step 1025, method 1000 generates a second two-dimensional (2-D) bar code 410.2, such as a matrix code, QR code, or other arrangement of black-and-white cells or bars. This encodes 405 the ciphertext transaction data 425 into a graphical form 410.2. (See FIG. 4.)
In step 1030, method 1000 generates a merged 2-D barcode 510 which integrates the data from first barcode 410.1 and second barcode 410.2. Merged barcode 510 is created in such a way as to encode all the data from first barcode 410.1, which is the plaintext transaction data 240; and also all the data from second barcode 410.2, which is the ciphertext transaction data 425. (See FIG. 5.)
In an embodiment, merged barcode 510 is generated by overlapping corresponding cells of first barcode 410.1 and second barcode 410.2, and mapping various overlapped cell combinations to designated colors. Therefore, in an embodiment, merged barcode 510 is a multicolored barcode. A color mapping function, table, or algorithm may be implemented to perform such a mapping, mapping a first cell overlapping a second cell to a specific color. An exemplary code sample may be, for example and without limitation:
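(The following pseudocode is one illustrative form; the array names FirstBarcode, SecondBarcode, and MergedBarcode are placeholders.)

    a = FirstBarcode[m][n]
    b = SecondBarcode[m][n]
    MergedBarcode[m][n] = FinalCellColor(a + b)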
where ‘m’ and ‘n’ are row and column values; while ‘a’ and ‘b’ may take values of ‘0’ or ‘1’ for “Black” and “White” respectively. The plus (+) operator indicates overlapping the first cell over the second cell; and a table FinalCellColor(a, b) may be defined with values such as Green, Yellow, Red, Blue depending on the specific values (0 or 1) of a and b. For example, FinalCellColor (0, 0) may be defined as “Yellow”, and so maps a black-on-black cell combination to the color yellow for a single combined cell.
Other similar mappings, including mappings which allow for inclusion of alternative or additional colors, may be envisioned as well. (See again
In an embodiment of the present system and method, method 1000 stops after completion of step 1030, the method having generated the multi-colored merged barcode 510, which may also serve as the Final Transaction Document (FTD) image 805.
In an alternative embodiment, transaction document 200 includes, or has associated with it, graphical biometric signatures 605 which represent signatures of the human parties to the transaction. Graphical biometric signatures may be fingerprints, hand prints, iris scans, retinal scans, facial images, or similar. In such embodiments, method 1000 may continue (after step 1030) with step 1035.
In step 1035, exemplary method 1000 generates a unique, document-specific image shuffling sequence 365 based on the unique transaction data 240. Image shuffling sequence 365 maps a set of digits, typically starting at 1 and incrementing by 1, back onto itself, but in a different order. The order mapping indicates how segments 607 of a biometric image 605 may be spatially re-ordered to create shuffled biometric image 615. The digits may be shuffled or re-ordered according to any of a variety of shuffling methods 350, as long as shuffling algorithm 350 generates the shuffling sequence 365 both based upon, and unique to, the unique transaction data 240.
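For illustration only, one possible shuffling method 350 seeds a deterministic pseudo-random shuffle with a value derived from the unique transaction data, so that the same document always reproduces the same sequence 365; the class and method names below are placeholders:

import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.Random;

public class ShuffleSequence {
    // Generates image shuffling sequence 365: a permutation of 1..n that is
    // deterministic for, and unique in practice to, the transaction data.
    public static List<Integer> generate(String uniqueTransactionData, int n) throws Exception {
        byte[] digest = MessageDigest.getInstance("SHA-256")
                .digest(uniqueTransactionData.getBytes(StandardCharsets.UTF_8));
        long seed = 0;
        for (int i = 0; i < 8; i++) seed = (seed << 8) | (digest[i] & 0xFF);
        List<Integer> order = new ArrayList<>();
        for (int i = 1; i <= n; i++) order.add(i);
        Collections.shuffle(order, new Random(seed)); // deterministic for a fixed seed
        return order;
    }
}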
In step 1040, method 1000 obtains the graphical biometric signatures 605 of the transaction parties, for example the fingerprints 605 of a buyer and a seller. The graphical biometric signatures 605 are typically images in such formats as GIF, JPG, BMP, and other graphical formats known in the art.
In step 1045, method 1000 divides each of the biometric images 605 (such as fingerprints 605) into multiple image segments 607, typically square or rectangular in shape. The image segments 607 are mutually non-overlapping, but together the image segments 607 may be spatially arranged in their original order of spatial relations and original orientations to reconstruct each biometric image 605. In an embodiment, the segments 607 are labeled/numbered with sequential digits, in order to be shuffled according to image shuffling sequence 365.
In step 1050, method 1000 spatially rearranges each of the first biometric image 605.1 and the second biometric image 605.2, to create respective, shuffled biometric images 615.1, 615.2. The shuffling is done according to the order indicated by the mapping in the document-specific image shuffling sequence 365.
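For illustration only, steps 1045 and 1050 might be combined in a sketch such as the following, which tiles a biometric image into rows x cols segments and redraws the tiles in shuffled order; the class name is a placeholder, and the image is assumed to divide evenly into tiles:

import java.awt.Graphics2D;
import java.awt.image.BufferedImage;
import java.util.List;

public class BiometricShuffler {
    // Divides the biometric image 605 into segments 607 and re-draws them in
    // the order given by shuffling sequence 365 (1-based), yielding image 615.
    public static BufferedImage shuffle(BufferedImage src, int rows, int cols,
                                        List<Integer> sequence) {
        int tileW = src.getWidth() / cols, tileH = src.getHeight() / rows;
        BufferedImage out = new BufferedImage(src.getWidth(), src.getHeight(),
                BufferedImage.TYPE_INT_ARGB);
        Graphics2D g = out.createGraphics();
        for (int i = 0; i < rows * cols; i++) {
            int from = sequence.get(i) - 1; // index of the source segment
            BufferedImage tile = src.getSubimage(
                    (from % cols) * tileW, (from / cols) * tileH, tileW, tileH);
            g.drawImage(tile, (i % cols) * tileW, (i / cols) * tileH, null);
        }
        g.dispose();
        return out;
    }
}

Running the same routine with the inverse sequence (discussed below in connection with step 1150) restores the original image.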
In step 1055, method 1000 generates a merged shuffled biometric signature 712. Merged biometric signature 712 is created in such a way as to encode substantially all the image data from first shuffled biometric image 615.1 and from second shuffled biometric image 615.2.
In an embodiment, merged shuffled biometric signature 712 is generated by overlapping corresponding image pixels of first shuffled biometric image 615.1 and second shuffled biometric image 615.2, and mapping various overlapped pixel combinations to designated colors. Therefore, in an embodiment, merged shuffled biometric signature 712 is a multicolored image. As with merged barcode 510 discussed above, a color mapping function, table, or algorithm may be implemented to perform such a mapping, mapping a first pixel overlapping a second pixel to a specific final color pixel.
In step 1060, method 1000 forms a combined document image 805 which may be referred to as the Final Transaction Document (FTD) image 805, and which spatially concatenates merged 2-D barcode image 510 with merged shuffled biometric image 712. In an embodiment, the two images are concatenated by arranging them spatially side-by-side in one image.
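For illustration only, the side-by-side concatenation of step 1060 might be sketched as follows; the class and method names are placeholders:

import java.awt.Graphics2D;
import java.awt.image.BufferedImage;

public class FtdComposer {
    // Places merged barcode 510 and merged shuffled biometric image 712
    // side-by-side in a single Final Transaction Document image 805.
    public static BufferedImage concatenate(BufferedImage barcode, BufferedImage biometric) {
        int width = barcode.getWidth() + biometric.getWidth();
        int height = Math.max(barcode.getHeight(), biometric.getHeight());
        BufferedImage ftd = new BufferedImage(width, height, BufferedImage.TYPE_INT_ARGB);
        Graphics2D g = ftd.createGraphics();
        g.drawImage(barcode, 0, 0, null);
        g.drawImage(biometric, barcode.getWidth(), 0, null);
        g.dispose();
        return ftd;
    }
}

The receiving side (step 1105, below) may reverse this concatenation with two corresponding getSubimage calls.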
Received Data: Data Retrieval and Validation
In an embodiment of the present system and method, the FTD image 805 is transmitted to receiving parties, which may for example be business offices associated with the point-of-transaction seller and/or buyer.
The transaction data encoded in the FTD image 805 can be extracted at the receiving end. In addition, data validation may be performed if there are any disputes between sender and receiver.
In step 1105, the method 1100 extracts from FTD image 805 both of: (i) the merged 2-D barcode 510; and (ii) the merged shuffled biometric signature 712.
In an embodiment, extraction of the two images 510, 712 involves a straightforward spatial parsing of the two images, as the two images are placed spatially side-by-side and are non-overlapping in FTD image 805.
In step 1110, method 1100 extracts, from merged 2-D barcode 510, each of the first 2-D barcode 410.1 with the plaintext data 240, and also the second 2-D barcode 410.2 with the ciphertext data 425. In an embodiment, extracting the two barcodes 410 is performed by reversing the merging method 500 described above.
Consider for example any cell of merged 2-D barcode 510 at a cell coordinate (X, Y). That cell will have a specific color. The corresponding cells at the (X, Y) coordinates of first 2-D barcode 410.1 and second 2-D barcode 410.2 are restored by determining the overlapped cell combination 515 which resulted in the output color.
Presented here is exemplary code (which may for example be in C, C++, Java, or a similar programming language, or may be pseudocode which can be implemented in specific form in any number of known programming languages) of a kind which may be employed to distinguish plaintext 2-D barcode 410.1 and ciphertext 2-D barcode 410.2 from a merged 2-D barcode image 510 (the identifier Merged_Invoice_QRCode below denotes merged barcode 510):
IF Merged_Invoice_QRCode(x,y).CellColor=FinalCellColor(1,0) THEN
Actual_Invoice_QRCode(x,y).PixelColor=“WHITE” AND Encrypted_Invoice_QRCode(x,y).PixelColor=“BLACK”;
Persons skilled in the relevant arts will recognize that the above code sample is exemplary only, and other code and other color combinations may be employed within the scope and spirit of the present system and method.
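For further illustration only, the reverse color lookup might be sketched in Java as follows, using the same color table as the merging side; the class and method names are illustrative placeholders:

import java.awt.image.BufferedImage;

public class BarcodeDemerger {
    // For each merged cell color, recover the (a, b) pair of cells of the
    // plaintext barcode 410.1 and ciphertext barcode 410.2 respectively;
    // finalCellColor must be the same table used when merging.
    public static int[][][] demerge(BufferedImage merged, int[][] finalCellColor) {
        int rows = merged.getHeight(), cols = merged.getWidth();
        int[][][] cells = new int[2][rows][cols]; // [0] = first, [1] = second barcode
        for (int m = 0; m < rows; m++)
            for (int n = 0; n < cols; n++) {
                int rgb = merged.getRGB(n, m);
                for (int a = 0; a <= 1; a++)
                    for (int b = 0; b <= 1; b++)
                        if (finalCellColor[a][b] == rgb) {
                            cells[0][m][n] = a;
                            cells[1][m][n] = b;
                        }
            }
        return cells;
    }
}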
In step 1115, method 1100 extracts a first set of plaintext unique transaction document data (labeled herein as 240.1, though not specifically shown in the figures) from the recovered first 2-D barcode 410.1, according to a suitable barcode data extraction method known in the art for the specific 2-D barcode technology employed.
In step 1120, method 1100 generates an encryption password 360. The encryption password 360 is generated from the first set of unique plaintext transaction data 240.1, and is the same encryption password 360 as generated originally to create encrypted data 425 at the point-of-sale.
In step 1125, method 1100 extracts the ciphertext transaction document data 425 from the recovered second 2-D barcode 410.2. Ciphertext data 425 is recovered by extracting data from second 2-D barcode 410.2 according to a suitable barcode data extraction method known in the art for the specific 2-D barcode technology employed.
In step 1130, method 1100 decrypts ciphertext data 425 by applying the encryption password 360 (which was generated in step 1120) to ciphertext data 425 according to a suitable decryption algorithm; the suitable decryption algorithm is one designed to decrypt ciphertext generated in step 1020 of method 1000 above via encryption algorithm 420.
Summarizing to this point: a first set of plaintext unique transaction document data 240.1 has been extracted from first 2-D barcode 410.1, while a second set of plaintext unique transaction document data 240.2 has been recovered, via decryption, from second 2-D barcode 410.2.
In step 1135, method 1100 compares first set of plaintext transaction document data 240.1 with second set of plaintext transaction document data 240.2. If document and data integrity has been maintained, the two sets of plaintext data 240.1, 240.2 should be the same.
If the comparison of step 1135 determines that first set of plaintext transaction document data 240.1 and second set of plaintext transaction document data 240.2 are not the same, then in step 1140.2 method 1100 determines that the retrieved data is invalid or corrupted, or that in some other way data integrity has not been maintained. The method may stop at this point. In an alternative embodiment (not illustrated in the flow charts), method 1100 may instead proceed with additional error-handling or reporting steps.
If the comparison of step 1135 determines that first set of plaintext transaction document data 240.1 and second set of plaintext transaction document data 240.2 are the same, then in step 1140.1 method 1100 determines that the retrieved data is valid, that is, that data integrity has been maintained.
In an alternative embodiment (not shown in the figures), additional or alternative validation steps may be performed at this point.
In an embodiment of the present system and method, method 1100 may stop at this point. In an alternative embodiment, method 1100 continues with step 1145.
In step 1145, method 1100 generates the unique image shuffling sequence 365 from the first set of plaintext unique transaction document data 240.1. The unique image shuffling sequence 365 is the same as that generated originally to create shuffled biometric images 615 at the point-of-sale.
In step 1150, method 1100 generates an inverse image shuffling sequence, which is the inverse of image shuffling sequence 365. In an embodiment, the inverse image shuffling sequence is simply a reverse of the mapping of image shuffling sequence 365, and can be used to restore a shuffled biometric image 615 to the corresponding original biometric image 605 (such as an original, unshuffled fingerprint image 605).
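For illustration only, generating the inverse amounts to inverting a permutation; the class and method names below are placeholders:

import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class InverseSequence {
    // If sequence.get(i) == j, then shuffled position i+1 holds original
    // segment j; the inverse maps each segment back to its original position.
    public static List<Integer> invert(List<Integer> sequence) {
        List<Integer> inverse = new ArrayList<>(Collections.nCopies(sequence.size(), 0));
        for (int i = 0; i < sequence.size(); i++)
            inverse.set(sequence.get(i) - 1, i + 1); // 1-based positions
        return inverse;
    }
}

Applying the shuffling routine of step 1050 with this inverse sequence restores a shuffled biometric image 615 to the corresponding original biometric image 605.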
In step 1155, method 1100 extracts, from merged shuffled biometric image signature 712, the first shuffled biometric image 615.1 and the second shuffled biometric image 615.2. These images may for example be the shuffled images of the transaction-participant fingerprints 605 originally obtained at the point of service.
In an embodiment, extracting the two shuffled biometric images 615 is performed by reversing the merging method described above.
Consider for example any pixel of merged shuffled biometric image 712 at a pixel coordinate (X, Y). That pixel will have a specific color. The corresponding pixels at the (X, Y) coordinates of first shuffled biometric image 615.1 and second shuffled biometric image 615.2 are restored by determining the overlapped pixel combination 715 which resulted in the output (merged) pixel color.
Presented here is exemplary code (which may for example be in C, C++, Java, or a similar programming language, or may be pseudocode which can be implemented in specific form in any number of known programming languages) of a kind which may be employed to extract first shuffled biometric image 615.1 and second shuffled biometric image 615.2 from a merged shuffled biometric image 712:
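(The identifiers Merged_Biometric_Image, First_Shuffled_Image, Second_Shuffled_Image, and FinalPixelColor below are illustrative placeholders only, paralleling the barcode example above.)

IF Merged_Biometric_Image(x,y).PixelColor=FinalPixelColor(a,b) THEN
First_Shuffled_Image(x,y).PixelColor=a AND Second_Shuffled_Image(x,y).PixelColor=b;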
Persons skilled in the relevant arts will recognize that the above code sample is exemplary only, and other code and other pixel color combinations may be employed within the scope and spirit of the present system and method.
In step 1160, method 1100 reverse-shuffles each of the first shuffled biometric image 615.1 and the second shuffled biometric image 615.2 (from step 1155), each according to the inverse image shuffling sequence generated in step 1150. The result is a respective first unshuffled biometric image signature 605.1 (for example, an image recognizable as a first fingerprint) and a second unshuffled biometric image signature 605.2 (for example, an image recognizable as a second fingerprint).
In step 1165, method 1100 obtains, from a suitable employee or personnel database, a stored biometric image (such as a fingerprint) which is expected for one or both parties to the sales transaction. In an alternative embodiment, step 1165 may obtain one or both validation fingerprint images of expected human parties to the transaction via live, real-time scanning of their fingerprints with a local or remote fingerprint scanner. The expected fingerprints, of the persons who should be the signatories to the transaction, may be referred to herein as the “validation biometric signatures” or equivalently as the “stored/live biometric signatures”. (“Image” may also be used in place of “signature”.)
In step 1170, the method 1100 may compare the first unshuffled biometric image 605.1 with a stored/live biometric image for a first party to the sales transaction. If the two images compare favorably (that is, are determined to be biometric images of the same person, such as fingerprints of the same person), then in step 1175.1 the first signature on the electronic document is considered to be verified as valid. If the two images do not compare favorably (that is, are determined to not be biometric images of the same person, or may not be biometric images of the same person), then in step 1175.2 the first signature on the electronic document is considered to be not verified or invalid.
Similarly, and also in step 1170, the method 1100 may compare the second unshuffled biometric image 605.2 with a stored/live biometric image for a second party to the sales transaction. If the two images compare favorably (are determined to be biometric images of the same person, such as fingerprints of the same person), then in step 1175.1 the second signature on the electronic document is considered to be verified as valid. If the two images do not compare favorably (that is, are determined to not be biometric images of the same person, or may not be biometric images of the same person), then in step 1175.2 the second signature on the electronic document is considered to be not verified or invalid.
Note: In an embodiment, for fingerprint comparisons and validation, the present system and method may employ proprietary code. In an alternative embodiment, the present system and method may instead employ publicly available open-source code for fingerprint comparisons.
The present system and method supports the collection, recording, encoding and encryption, transmission, and validation of commercial transaction data (such as bills of sale and sales receipts) at a point-of-sale and at other locations. These tasks are accomplished entirely electronically/digitally, via an electronic processing device such as a tablet 100, without the use of paper forms. However, the present system and method still retains many of the same benefits as paper forms, along with additional advantages.
1. Paper Bills Cannot Be Tampered with Easily.
The present system and method identifies whether any of the files have been tampered with or modified, by extracting the actual (plaintext) data 2-D barcode 410.1 and the encrypted (ciphertext) data 2-D barcode 410.2. If the encrypted data 2-D barcode 410.2 cannot be decrypted properly, or does not match the actual transaction data 240, then the present system and method may determine that the data files have been tampered with, modified, or corrupted.
2. Forged Signatures Are Easily Detectable.
Hand-written signatures may be easily forged, but fingerprints 605 and other graphical biometric signatures 605 cannot be readily forged. Further, by virtue of the image shuffling sequence 365 and the merging of fingerprint images into a combined shuffled fingerprint image 712, the original fingerprint images can be reconstructed only by parties in possession of the proprietary shuffling and merging algorithms employed.
Other Features
In various embodiments, the present system and method offers these features:
The present system and method may be implemented in any workflow application as a licensed feature, thereby providing the present system and method as integrated workflow functionality. The present system and method may also be offered as complete functionality via a software service, or in a software development kit (SDK), to incorporate the present system and method into business-to-business (B2B) customer applications.
In various embodiments, the present system and method may be employed for Direct Store Delivery (DSD) applications and also for Less-than-Truckload (LTL) applications, to completely avoid paper bills.
In various embodiments, the present system and method may include separate software modules or systems to create and/or read paper-free bills.
In embodiments of the present system and method, potential benefits and cost-savings compared to paper billing may include:
To supplement the present disclosure, this application incorporates entirely by reference the following commonly assigned patents, patent application publications, and patent applications:
In the specification and/or figures, typical embodiments of the invention have been disclosed. The present invention is not limited to such exemplary embodiments. The use of the term “and/or” includes any and all combinations of one or more of the associated listed items. The figures are schematic representations and so are not necessarily drawn to scale. Unless otherwise noted, specific terms have been used in a generic and descriptive sense and not for purposes of limitation.
In the description above, a flow charted technique may be described in a series of sequential actions. Unless expressly stated to the contrary, the sequence of the actions and the party performing the actions may be freely changed without departing from the scope of the teachings. Actions may be added, deleted, or altered in several ways. Similarly, the actions may be re-ordered or looped (repeated). Further, although processes, methods, algorithms or the like may be described in a sequential order, such processes, methods, algorithms, or any combination thereof may be operable to be performed in alternative orders. Further, some actions within a process, method, or algorithm may be performed simultaneously during at least a point in time (e.g., actions performed in parallel), and may also be performed in whole, in part, or in any combination thereof.
Further, in some embodiments, certain method decision steps or branching points discussed herein may be eliminated within the scope and spirit of the present system and method; still further, additional options, alternative outcomes, or entire additional decision or branching points may be added within the scope and spirit of the present system and method.
As used herein, the terms “comprises,” “comprising,” “includes,” “including,” “has,” “having” or any other variation thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, article, or apparatus that comprises a list of features is not necessarily limited only to those features but may include other features not expressly listed or inherent to such process, method, article, or apparatus. Further, unless expressly stated to the contrary, “or” refers to an inclusive-or and not to an exclusive-or. For example, a condition A or B is satisfied by any one of the following: A is true (or present) and B is false (or not present), A is false (or not present) and B is true (or present), and both A and B are true (or present).
Number | Date | Country | Kind |
---|---|---|---|
201711020273 | Jun 2017 | IN | national |
Number | Name | Date | Kind |
---|---|---|---|
6832725 | Gardiner et al. | Dec 2004 | B2 |
7128266 | Zhu et al. | Oct 2006 | B2 |
7159783 | Walczyk et al. | Jan 2007 | B2 |
7413127 | Ehrhart et al. | Aug 2008 | B2 |
7726575 | Wang et al. | Jun 2010 | B2 |
7783893 | Gorelik et al. | Aug 2010 | B2 |
8294969 | Plesko | Oct 2012 | B2 |
8317105 | Kotlarsky et al. | Nov 2012 | B2 |
8322622 | Liu | Dec 2012 | B2 |
8366005 | Kotlarsky et al. | Feb 2013 | B2 |
8371507 | Haggerty et al. | Feb 2013 | B2 |
8376233 | Van Horn et al. | Feb 2013 | B2 |
8381979 | Franz | Feb 2013 | B2 |
8390909 | Plesko | Mar 2013 | B2 |
8408464 | Zhu et al. | Apr 2013 | B2 |
8408468 | Horn et al. | Apr 2013 | B2 |
8408469 | Good | Apr 2013 | B2 |
8424768 | Rueblinger et al. | Apr 2013 | B2 |
8448863 | Xian et al. | May 2013 | B2 |
8457013 | Essinger et al. | Jun 2013 | B2 |
8459557 | Havens et al. | Jun 2013 | B2 |
8469272 | Kearney | Jun 2013 | B2 |
8474712 | Kearney et al. | Jul 2013 | B2 |
8479992 | Kotlarsky et al. | Jul 2013 | B2 |
8490877 | Kearney | Jul 2013 | B2 |
8517271 | Kotlarsky et al. | Aug 2013 | B2 |
8523076 | Good | Sep 2013 | B2 |
8528818 | Ehrhart et al. | Sep 2013 | B2 |
8544737 | Gomez et al. | Oct 2013 | B2 |
8548420 | Grunow et al. | Oct 2013 | B2 |
8550335 | Samek et al. | Oct 2013 | B2 |
8550354 | Gannon et al. | Oct 2013 | B2 |
8550357 | Kearney | Oct 2013 | B2 |
8556174 | Kosecki et al. | Oct 2013 | B2 |
8556176 | Van Horn et al. | Oct 2013 | B2 |
8556177 | Hussey et al. | Oct 2013 | B2 |
8559767 | Barber et al. | Oct 2013 | B2 |
8561895 | Gomez et al. | Oct 2013 | B2 |
8561903 | Sauerwein | Oct 2013 | B2 |
8561905 | Edmonds et al. | Oct 2013 | B2 |
8565107 | Pease et al. | Oct 2013 | B2 |
8571307 | Li et al. | Oct 2013 | B2 |
8579200 | Samek et al. | Nov 2013 | B2 |
8583924 | Caballero et al. | Nov 2013 | B2 |
8584945 | Wang et al. | Nov 2013 | B2 |
8587595 | Wang | Nov 2013 | B2 |
8587697 | Hussey et al. | Nov 2013 | B2 |
8588869 | Sauerwein et al. | Nov 2013 | B2 |
8590789 | Nahill et al. | Nov 2013 | B2 |
8596539 | Havens et al. | Dec 2013 | B2 |
8596542 | Havens et al. | Dec 2013 | B2 |
8596543 | Havens et al. | Dec 2013 | B2 |
8599271 | Havens et al. | Dec 2013 | B2 |
8599957 | Peake et al. | Dec 2013 | B2 |
8600158 | Li et al. | Dec 2013 | B2 |
8600167 | Showering | Dec 2013 | B2 |
8602309 | Longacre et al. | Dec 2013 | B2 |
8608053 | Meier et al. | Dec 2013 | B2 |
8608071 | Liu et al. | Dec 2013 | B2 |
8611309 | Wang et al. | Dec 2013 | B2 |
8615487 | Gomez et al. | Dec 2013 | B2 |
8621123 | Caballero | Dec 2013 | B2 |
8622303 | Meier et al. | Jan 2014 | B2 |
8628013 | Ding | Jan 2014 | B2 |
8628015 | Wang et al. | Jan 2014 | B2 |
8628016 | Winegar | Jan 2014 | B2 |
8629926 | Wang | Jan 2014 | B2 |
8630491 | Longacre et al. | Jan 2014 | B2 |
8635309 | Berthiaume et al. | Jan 2014 | B2 |
8636200 | Kearney | Jan 2014 | B2 |
8636212 | Nahill et al. | Jan 2014 | B2 |
8636215 | Ding et al. | Jan 2014 | B2 |
8636224 | Wang | Jan 2014 | B2 |
8638806 | Wang et al. | Jan 2014 | B2 |
8640958 | Lu et al. | Feb 2014 | B2 |
8640960 | Wang et al. | Feb 2014 | B2 |
8643717 | Li et al. | Feb 2014 | B2 |
8646692 | Meier et al. | Feb 2014 | B2 |
8646694 | Wang et al. | Feb 2014 | B2 |
8657200 | Ren et al. | Feb 2014 | B2 |
8659397 | Vargo et al. | Feb 2014 | B2 |
8668149 | Good | Mar 2014 | B2 |
8678285 | Kearney | Mar 2014 | B2 |
8678286 | Smith et al. | Mar 2014 | B2 |
8682077 | Longacre | Mar 2014 | B1 |
D702237 | Oberpriller et al. | Apr 2014 | S |
8687282 | Feng et al. | Apr 2014 | B2 |
8692927 | Pease et al. | Apr 2014 | B2 |
8695880 | Bremer et al. | Apr 2014 | B2 |
8698949 | Grunow et al. | Apr 2014 | B2 |
8702000 | Barber et al. | Apr 2014 | B2 |
8717494 | Gannon | May 2014 | B2 |
8720783 | Biss et al. | May 2014 | B2 |
8723804 | Fletcher et al. | May 2014 | B2 |
8723904 | Marty et al. | May 2014 | B2 |
8727223 | Wang | May 2014 | B2 |
8740082 | Wilz | Jun 2014 | B2 |
8740085 | Furlong et al. | Jun 2014 | B2 |
8746563 | Hennick et al. | Jun 2014 | B2 |
8750445 | Peake et al. | Jun 2014 | B2 |
8752766 | Xian et al. | Jun 2014 | B2 |
8756059 | Braho et al. | Jun 2014 | B2 |
8757495 | Qu et al. | Jun 2014 | B2 |
8760563 | Koziol et al. | Jun 2014 | B2 |
8763909 | Reed et al. | Jul 2014 | B2 |
8777108 | Coyle | Jul 2014 | B2 |
8777109 | Oberpriller et al. | Jul 2014 | B2 |
8779898 | Havens et al. | Jul 2014 | B2 |
8781520 | Payne et al. | Jul 2014 | B2 |
8783573 | Havens et al. | Jul 2014 | B2 |
8789757 | Barten | Jul 2014 | B2 |
8789758 | Hawley et al. | Jul 2014 | B2 |
8789759 | Xian et al. | Jul 2014 | B2 |
8794520 | Wang et al. | Aug 2014 | B2 |
8794522 | Ehrhart | Aug 2014 | B2 |
8794525 | Amundsen et al. | Aug 2014 | B2 |
8794526 | Wang et al. | Aug 2014 | B2 |
8798367 | Ellis | Aug 2014 | B2 |
8807431 | Wang et al. | Aug 2014 | B2 |
8807432 | Van Horn et al. | Aug 2014 | B2 |
8820630 | Qu et al. | Sep 2014 | B2 |
8822848 | Meagher | Sep 2014 | B2 |
8824692 | Sheerin et al. | Sep 2014 | B2 |
8824696 | Braho | Sep 2014 | B2 |
8842849 | Wahl et al. | Sep 2014 | B2 |
8844822 | Kotlarsky et al. | Sep 2014 | B2 |
8844823 | Fritz et al. | Sep 2014 | B2 |
8849019 | Li et al. | Sep 2014 | B2 |
D716285 | Chaney et al. | Oct 2014 | S |
8851383 | Yeakley et al. | Oct 2014 | B2 |
8854633 | Laffargue | Oct 2014 | B2 |
8866963 | Grunow et al. | Oct 2014 | B2 |
8868421 | Braho et al. | Oct 2014 | B2 |
8868519 | Maloy et al. | Oct 2014 | B2 |
8868802 | Barten | Oct 2014 | B2 |
8868803 | Caballero | Oct 2014 | B2 |
8870074 | Gannon | Oct 2014 | B1 |
8879639 | Sauerwein | Nov 2014 | B2 |
8880426 | Smith | Nov 2014 | B2 |
8881983 | Havens et al. | Nov 2014 | B2 |
8881987 | Wang | Nov 2014 | B2 |
8903172 | Smith | Dec 2014 | B2 |
8908995 | Benos et al. | Dec 2014 | B2 |
8910870 | Li et al. | Dec 2014 | B2 |
8910875 | Ren et al. | Dec 2014 | B2 |
8914290 | Hendrickson et al. | Dec 2014 | B2 |
8914788 | Pettinelli et al. | Dec 2014 | B2 |
8915439 | Feng et al. | Dec 2014 | B2 |
8915444 | Havens et al. | Dec 2014 | B2 |
8916789 | Woodburn | Dec 2014 | B2 |
8918250 | Hollifield | Dec 2014 | B2 |
8918564 | Caballero | Dec 2014 | B2 |
8925818 | Kosecki et al. | Jan 2015 | B2 |
8939374 | Jovanovski et al. | Jan 2015 | B2 |
8942480 | Ellis | Jan 2015 | B2 |
8944313 | Williams et al. | Feb 2015 | B2 |
8944327 | Meier et al. | Feb 2015 | B2 |
8944332 | Harding et al. | Feb 2015 | B2 |
8950678 | Germaine et al. | Feb 2015 | B2 |
D723560 | Zhou et al. | Mar 2015 | S |
8967468 | Gomez et al. | Mar 2015 | B2 |
8971346 | Sevier | Mar 2015 | B2 |
8976030 | Cunningham et al. | Mar 2015 | B2 |
8976368 | Akel et al. | Mar 2015 | B2 |
8978981 | Guan | Mar 2015 | B2 |
8978983 | Bremer et al. | Mar 2015 | B2 |
8978984 | Hennick et al. | Mar 2015 | B2 |
8985456 | Zhu et al. | Mar 2015 | B2 |
8985457 | Soule et al. | Mar 2015 | B2 |
8985459 | Kearney et al. | Mar 2015 | B2 |
8985461 | Gelay et al. | Mar 2015 | B2 |
8988578 | Showering | Mar 2015 | B2 |
8988590 | Gillet et al. | Mar 2015 | B2 |
8991704 | Hopper et al. | Mar 2015 | B2 |
8996194 | Davis et al. | Mar 2015 | B2 |
8996384 | Funyak et al. | Mar 2015 | B2 |
8998091 | Edmonds et al. | Apr 2015 | B2 |
9002641 | Showering | Apr 2015 | B2 |
9007368 | Laffargue et al. | Apr 2015 | B2 |
9010641 | Qu et al. | Apr 2015 | B2 |
9015513 | Murawski et al. | Apr 2015 | B2 |
9016576 | Brady et al. | Apr 2015 | B2 |
D730357 | Fitch et al. | May 2015 | S |
9022288 | Nahill et al. | May 2015 | B2 |
9030964 | Essinger et al. | May 2015 | B2 |
9033240 | Smith et al. | May 2015 | B2 |
9033242 | Gillet et al. | May 2015 | B2 |
9036054 | Koziol et al. | May 2015 | B2 |
9037344 | Chamberlin | May 2015 | B2 |
9038911 | Xian et al. | May 2015 | B2 |
9038915 | Smith | May 2015 | B2 |
D730901 | Oberpriller et al. | Jun 2015 | S |
D730902 | Fitch et al. | Jun 2015 | S |
9047098 | Barten | Jun 2015 | B2 |
9047359 | Caballero et al. | Jun 2015 | B2 |
9047420 | Caballero | Jun 2015 | B2 |
9047525 | Barber et al. | Jun 2015 | B2 |
9047531 | Showering et al. | Jun 2015 | B2 |
9049640 | Wang et al. | Jun 2015 | B2 |
9053055 | Caballero | Jun 2015 | B2 |
9053378 | Hou et al. | Jun 2015 | B1 |
9053380 | Xian et al. | Jun 2015 | B2 |
9057641 | Amundsen et al. | Jun 2015 | B2 |
9058526 | Powilleit | Jun 2015 | B2 |
9061527 | Tobin et al. | Jun 2015 | B2 |
9064165 | Havens et al. | Jun 2015 | B2 |
9064167 | Xian et al. | Jun 2015 | B2 |
9064168 | Todeschini et al. | Jun 2015 | B2 |
9064254 | Todeschini et al. | Jun 2015 | B2 |
9066032 | Wang | Jun 2015 | B2 |
9070032 | Corcoran | Jun 2015 | B2 |
D734339 | Zhou et al. | Jul 2015 | S |
D734751 | Oberpriller et al. | Jul 2015 | S |
9076459 | Braho et al. | Jul 2015 | B2 |
9079423 | Bouverie et al. | Jul 2015 | B2 |
9080856 | Laffargue | Jul 2015 | B2 |
9082023 | Feng et al. | Jul 2015 | B2 |
9084032 | Rautiola et al. | Jul 2015 | B2 |
9087250 | Coyle | Jul 2015 | B2 |
9092681 | Havens et al. | Jul 2015 | B2 |
9092682 | Wilz et al. | Jul 2015 | B2 |
9092683 | Koziol et al. | Jul 2015 | B2 |
9093141 | Liu | Jul 2015 | B2 |
9098763 | Lu et al. | Aug 2015 | B2 |
9104929 | Todeschini | Aug 2015 | B2 |
9104934 | Li et al. | Aug 2015 | B2 |
9107484 | Chaney | Aug 2015 | B2 |
9111159 | Liu et al. | Aug 2015 | B2 |
9111166 | Cunningham | Aug 2015 | B2 |
9135483 | Liu et al. | Sep 2015 | B2 |
9137009 | Gardiner | Sep 2015 | B1 |
9141839 | Xian et al. | Sep 2015 | B2 |
9147096 | Wang | Sep 2015 | B2 |
9148474 | Skvoretz | Sep 2015 | B2 |
9158000 | Sauerwein | Oct 2015 | B2 |
9158340 | Reed et al. | Oct 2015 | B2 |
9158953 | Gillet et al. | Oct 2015 | B2 |
9159059 | Daddabbo et al. | Oct 2015 | B2 |
9165174 | Huck | Oct 2015 | B2 |
9171543 | Emerick et al. | Oct 2015 | B2 |
9183425 | Wang | Nov 2015 | B2 |
9189669 | Zhu et al. | Nov 2015 | B2 |
9195844 | Todeschini et al. | Nov 2015 | B2 |
9202458 | Braho et al. | Dec 2015 | B2 |
9208366 | Liu | Dec 2015 | B2 |
9208367 | Wang | Dec 2015 | B2 |
9219836 | Bouverie et al. | Dec 2015 | B2 |
9224022 | Ackley et al. | Dec 2015 | B2 |
9224024 | Bremer et al. | Dec 2015 | B2 |
9224027 | Van Horn et al. | Dec 2015 | B2 |
D747321 | London et al. | Jan 2016 | S |
9230140 | Ackley | Jan 2016 | B1 |
9235553 | Fitch et al. | Jan 2016 | B2 |
9239950 | Fletcher | Jan 2016 | B2 |
9245492 | Ackley et al. | Jan 2016 | B2 |
9443123 | Hejl | Jan 2016 | B2 |
9248640 | Heng | Feb 2016 | B2 |
9250652 | London et al. | Feb 2016 | B2 |
9250712 | Todeschini | Feb 2016 | B1 |
9251411 | Todeschini | Feb 2016 | B2 |
9258033 | Showering | Feb 2016 | B2 |
9262633 | Todeschini et al. | Feb 2016 | B1 |
9262660 | Lu et al. | Feb 2016 | B2 |
9262662 | Chen et al. | Feb 2016 | B2 |
9269036 | Bremer | Feb 2016 | B2 |
9270782 | Hala et al. | Feb 2016 | B2 |
9274812 | Doren et al. | Mar 2016 | B2 |
9275388 | Havens et al. | Mar 2016 | B2 |
9277668 | Feng et al. | Mar 2016 | B2 |
9280693 | Feng et al. | Mar 2016 | B2 |
9286496 | Smith | Mar 2016 | B2 |
9297900 | Jiang | Mar 2016 | B2 |
9298964 | Li et al. | Mar 2016 | B2 |
9301427 | Feng et al. | Mar 2016 | B2 |
9304376 | Anderson | Apr 2016 | B2 |
9310609 | Rueblinger et al. | Apr 2016 | B2 |
9313377 | Todeschini et al. | Apr 2016 | B2 |
9317037 | Byford et al. | Apr 2016 | B2 |
D757009 | Oberpriller et al. | May 2016 | S |
9342723 | Liu et al. | May 2016 | B2 |
9342724 | McCloskey et al. | May 2016 | B2 |
9360304 | Xue et al. | Jun 2016 | B2 |
9361882 | Ressler et al. | Jun 2016 | B2 |
9365381 | Colonel et al. | Jun 2016 | B2 |
9373018 | Colavito et al. | Jun 2016 | B2 |
9375945 | Bowles | Jun 2016 | B1 |
9378403 | Wang et al. | Jun 2016 | B2 |
D760719 | Zhou et al. | Jul 2016 | S |
9383848 | Daghigh | Jul 2016 | B2 |
9384374 | Bianconi | Jul 2016 | B2 |
9390596 | Todeschini | Jul 2016 | B1 |
D762604 | Fitch et al. | Aug 2016 | S |
9411386 | Sauerwein | Aug 2016 | B2 |
9412242 | Van Horn et al. | Aug 2016 | B2 |
9418269 | Havens et al. | Aug 2016 | B2 |
9418270 | Van Volkinburg et al. | Aug 2016 | B2 |
9423318 | Lui et al. | Aug 2016 | B2 |
D766244 | Zhou et al. | Sep 2016 | S |
9443222 | Singel et al. | Sep 2016 | B2 |
9454689 | McCloskey et al. | Sep 2016 | B2 |
9464885 | Lloyd et al. | Oct 2016 | B2 |
9465967 | Xian et al. | Oct 2016 | B2 |
9478113 | Xie et al. | Oct 2016 | B2 |
9478983 | Kather et al. | Oct 2016 | B2 |
D771631 | Fitch et al. | Nov 2016 | S |
9481186 | Bouverie et al. | Nov 2016 | B2 |
9488986 | Solanki | Nov 2016 | B1 |
9489782 | Payne et al. | Nov 2016 | B2 |
9490540 | Davies et al. | Nov 2016 | B1 |
9491729 | Rautiola et al. | Nov 2016 | B2 |
9497092 | Gomez et al. | Nov 2016 | B2 |
9507974 | Todeschini | Nov 2016 | B1 |
9519814 | Cudzilo | Dec 2016 | B2 |
9521331 | Bessettes et al. | Dec 2016 | B2 |
9530038 | Xian et al. | Dec 2016 | B2 |
D777166 | Bidwell et al. | Jan 2017 | S |
9558386 | Yeakley | Jan 2017 | B2 |
9572901 | Todeschini | Feb 2017 | B2 |
9606581 | Howe et al. | Mar 2017 | B1 |
D783601 | Schulte et al. | Apr 2017 | S |
D785617 | Bidwell et al. | May 2017 | S |
D785636 | Oberpriller et al. | May 2017 | S |
9646189 | Lu et al. | May 2017 | B2 |
9646191 | Unemyr et al. | May 2017 | B2 |
9652648 | Ackley et al. | May 2017 | B2 |
9652653 | Todeschini et al. | May 2017 | B2 |
9656487 | Ho et al. | May 2017 | B2 |
9659198 | Giordano et al. | May 2017 | B2 |
D790505 | Vargo et al. | Jun 2017 | S |
D790546 | Zhou et al. | Jun 2017 | S |
9680282 | Hanenburg | Jun 2017 | B2 |
9697401 | Feng et al. | Jul 2017 | B2 |
9701140 | Alaganchetty et al. | Jul 2017 | B1 |
20070063048 | Havens et al. | Mar 2007 | A1 |
20090134221 | Zhu et al. | May 2009 | A1 |
20100177076 | Essinger et al. | Jul 2010 | A1 |
20100177080 | Essinger et al. | Jul 2010 | A1 |
20100177707 | Essinger et al. | Jul 2010 | A1 |
20100177749 | Essinger et al. | Jul 2010 | A1 |
20110169999 | Grunow et al. | Jul 2011 | A1 |
20110202554 | Powilleit et al. | Aug 2011 | A1 |
20120111946 | Golant | May 2012 | A1 |
20120168512 | Kotlarsky et al. | Jul 2012 | A1 |
20120193423 | Samek | Aug 2012 | A1 |
20120203647 | Smith | Aug 2012 | A1 |
20120223141 | Good et al. | Sep 2012 | A1 |
20130043312 | Van Horn | Feb 2013 | A1 |
20130075168 | Amundsen et al. | Mar 2013 | A1 |
20130175341 | Kearney et al. | Jul 2013 | A1 |
20130175343 | Good | Jul 2013 | A1 |
20130257744 | Daghigh et al. | Oct 2013 | A1 |
20130257759 | Daghigh | Oct 2013 | A1 |
20130270346 | Xian et al. | Oct 2013 | A1 |
20130292475 | Kotlarsky et al. | Nov 2013 | A1 |
20130292477 | Hennick et al. | Nov 2013 | A1 |
20130293539 | Hunt et al. | Nov 2013 | A1 |
20130293540 | Laffargue et al. | Nov 2013 | A1 |
20130306728 | Thuries et al. | Nov 2013 | A1 |
20130306731 | Pedrao | Nov 2013 | A1 |
20130307964 | Bremer et al. | Nov 2013 | A1 |
20130308625 | Park et al. | Nov 2013 | A1 |
20130313324 | Koziol et al. | Nov 2013 | A1 |
20130332524 | Fiala et al. | Dec 2013 | A1 |
20140001267 | Giordano et al. | Jan 2014 | A1 |
20140002828 | Laffargue et al. | Jan 2014 | A1 |
20140025584 | Liu et al. | Jan 2014 | A1 |
20140100813 | Showering | Jan 2014 | A1 |
20140034734 | Sauerwein | Feb 2014 | A1 |
20140039693 | Havens et al. | Feb 2014 | A1 |
20140049120 | Kohtz et al. | Feb 2014 | A1 |
20140049635 | Laffargue et al. | Feb 2014 | A1 |
20140061306 | Wu et al. | Mar 2014 | A1 |
20140063289 | Hussey et al. | Mar 2014 | A1 |
20140066136 | Sauerwein et al. | Mar 2014 | A1 |
20140067692 | Ye et al. | Mar 2014 | A1 |
20140070005 | Nahill et al. | Mar 2014 | A1 |
20140071840 | Venancio | Mar 2014 | A1 |
20140074746 | Wang | Mar 2014 | A1 |
20140076974 | Havens et al. | Mar 2014 | A1 |
20140078342 | Li et al. | Mar 2014 | A1 |
20140098792 | Wang et al. | Apr 2014 | A1 |
20140100774 | Showering | Apr 2014 | A1 |
20140103115 | Meier et al. | Apr 2014 | A1 |
20140104413 | McCloskey et al. | Apr 2014 | A1 |
20140104414 | McCloskey et al. | Apr 2014 | A1 |
20140104416 | Giordano et al. | Apr 2014 | A1 |
20140106725 | Sauerwein | Apr 2014 | A1 |
20140108010 | Maltseff et al. | Apr 2014 | A1 |
20140108402 | Gomez et al. | Apr 2014 | A1 |
20140108682 | Caballero | Apr 2014 | A1 |
20140110485 | Toa et al. | Apr 2014 | A1 |
20140114530 | Fitch et al. | Apr 2014 | A1 |
20140125853 | Wang | May 2014 | A1 |
20140125999 | Longacre et al. | May 2014 | A1 |
20140129378 | Richardson | May 2014 | A1 |
20140131443 | Smith | May 2014 | A1 |
20140131444 | Wang | May 2014 | A1 |
20140133379 | Wang et al. | May 2014 | A1 |
20140136208 | Maltseff et al. | May 2014 | A1 |
20140140585 | Wang | May 2014 | A1 |
20140152882 | Samek et al. | Jun 2014 | A1 |
20140158770 | Sevier et al. | Jun 2014 | A1 |
20140159869 | Lumsteg et al. | Jun 2014 | A1 |
20140166755 | Liu et al. | Jun 2014 | A1 |
20140166757 | Smith | Jun 2014 | A1 |
20140168787 | Wang et al. | Jun 2014 | A1 |
20140175165 | Havens et al. | Jun 2014 | A1 |
20140191913 | Ge et al. | Jul 2014 | A1 |
20140197239 | Havens et al. | Jul 2014 | A1 |
20140197304 | Feng et al. | Jul 2014 | A1 |
20140204268 | Grunow et al. | Jul 2014 | A1 |
20140214631 | Hansen | Jul 2014 | A1 |
20140217166 | Berthiaume et al. | Aug 2014 | A1 |
20140217180 | Liu | Aug 2014 | A1 |
20140231500 | Ehrhart et al. | Aug 2014 | A1 |
20140247315 | Marty et al. | Sep 2014 | A1 |
20140263493 | Amurgis et al. | Sep 2014 | A1 |
20140263645 | Smith et al. | Sep 2014 | A1 |
20140270196 | Braho et al. | Sep 2014 | A1 |
20140270229 | Braho | Sep 2014 | A1 |
20140278387 | DiGregorio | Sep 2014 | A1 |
20140282210 | Bianconi | Sep 2014 | A1 |
20140288933 | Braho et al. | Sep 2014 | A1 |
20140297058 | Barker et al. | Oct 2014 | A1 |
20140299665 | Barber et al. | Oct 2014 | A1 |
20140351317 | Smith et al. | Nov 2014 | A1 |
20140362184 | Jovanovski et al. | Dec 2014 | A1 |
20140363015 | Braho | Dec 2014 | A1 |
20140369511 | Sheerin et al. | Dec 2014 | A1 |
20140374483 | Lu | Dec 2014 | A1 |
20140374485 | Xian et al. | Dec 2014 | A1 |
20150001301 | Ouyang | Jan 2015 | A1 |
20150009338 | Laffargue et al. | Jan 2015 | A1 |
20150014416 | Kotlarsky et al. | Jan 2015 | A1 |
20150021397 | Rueblinger et al. | Jan 2015 | A1 |
20150028104 | Ma et al. | Jan 2015 | A1 |
20150029002 | Yeakley et al. | Jan 2015 | A1 |
20150032709 | Maloy et al. | Jan 2015 | A1 |
20150039309 | Braho et al. | Feb 2015 | A1 |
20150040378 | Saber et al. | Feb 2015 | A1 |
20150049347 | Laffargue et al. | Feb 2015 | A1 |
20150051992 | Smith | Feb 2015 | A1 |
20150053769 | Thuries et al. | Feb 2015 | A1 |
20150062366 | Liu et al. | Mar 2015 | A1 |
20150063215 | Wang | Mar 2015 | A1 |
20150088522 | Hendrickson et al. | Mar 2015 | A1 |
20150096872 | Woodburn | Apr 2015 | A1 |
20150100196 | Hollifield | Apr 2015 | A1 |
20150115035 | Meier et al. | Apr 2015 | A1 |
20150127791 | Kosecki et al. | May 2015 | A1 |
20150128116 | Chen et al. | May 2015 | A1 |
20150133047 | Smith et al. | May 2015 | A1 |
20150134470 | Hejl et al. | May 2015 | A1 |
20150136851 | Harding et al. | May 2015 | A1 |
20150142492 | Kumar | May 2015 | A1 |
20150144692 | Hejl | May 2015 | A1 |
20150144698 | Teng et al. | May 2015 | A1 |
20150149946 | Benos et al. | May 2015 | A1 |
20150161429 | Xian | Jun 2015 | A1 |
20150186703 | Chen et al. | Jul 2015 | A1 |
20150199957 | Funyak et al. | Jul 2015 | A1 |
20150210199 | Payne | Jul 2015 | A1 |
20150220753 | Zhu et al. | Aug 2015 | A1 |
20150254485 | Feng et al. | Sep 2015 | A1 |
20150310243 | Ackley et al. | Oct 2015 | A1 |
20150310389 | Crimm et al. | Oct 2015 | A1 |
20150327012 | Bian et al. | Nov 2015 | A1 |
20160014251 | Hejl | Jan 2016 | A1 |
20160040982 | Li et al. | Feb 2016 | A1 |
20160042241 | Todeschini | Feb 2016 | A1 |
20160055552 | Chai et al. | Feb 2016 | A1 |
20160057230 | Todeschini et al. | Feb 2016 | A1 |
20160062473 | Bouchat et al. | Mar 2016 | A1 |
20160092805 | Geisler et al. | Mar 2016 | A1 |
20160101936 | Chamberlin | Apr 2016 | A1 |
20160102975 | McCloskey et al. | Apr 2016 | A1 |
20160104019 | Todeschini et al. | Apr 2016 | A1 |
20160104274 | Jovanovski et al. | Apr 2016 | A1 |
20160109219 | Ackley et al. | Apr 2016 | A1 |
20160109220 | Laffargue | Apr 2016 | A1 |
20160109224 | Thuries et al. | Apr 2016 | A1 |
20160112631 | Ackley et al. | Apr 2016 | A1 |
20160112643 | Laffargue et al. | Apr 2016 | A1 |
20160117627 | Raj et al. | Apr 2016 | A1 |
20160124516 | Schoon et al. | May 2016 | A1 |
20160125217 | Todeschini | May 2016 | A1 |
20160125342 | Miller et al. | May 2016 | A1 |
20160133253 | Braho et al. | May 2016 | A1 |
20160171597 | Todeschini | Jun 2016 | A1 |
20160171666 | McCloskey | Jun 2016 | A1 |
20160171720 | Todeschini | Jun 2016 | A1 |
20160171775 | Todeschini et al. | Jun 2016 | A1 |
20160171777 | Todeschini et al. | Jun 2016 | A1 |
20160174674 | Oberpriller et al. | Jun 2016 | A1 |
20160178479 | Goldsmith | Jun 2016 | A1 |
20160178685 | Young et al. | Jun 2016 | A1 |
20160178707 | Young et al. | Jun 2016 | A1 |
20160179132 | Harr et al. | Jun 2016 | A1 |
20160179143 | Bidwell et al. | Jun 2016 | A1 |
20160179368 | Roeder | Jun 2016 | A1 |
20160179378 | Kent et al. | Jun 2016 | A1 |
20160180130 | Bremer | Jun 2016 | A1 |
20160180133 | Oberpriller et al. | Jun 2016 | A1 |
20160180136 | Meier et al. | Jun 2016 | A1 |
20160180594 | Todeschini | Jun 2016 | A1 |
20160180663 | McMahan et al. | Jun 2016 | A1 |
20160180678 | Ackley et al. | Jun 2016 | A1 |
20160180713 | Bernhardt et al. | Jun 2016 | A1 |
20160185136 | Ng et al. | Jun 2016 | A1 |
20160185291 | Chamberlin | Jun 2016 | A1 |
20160186926 | Oberpriller et al. | Jun 2016 | A1 |
20160188861 | Todeschini | Jun 2016 | A1 |
20160188939 | Sailors et al. | Jun 2016 | A1 |
20160188940 | Lu et al. | Jun 2016 | A1 |
20160188941 | Todeschini et al. | Jun 2016 | A1 |
20160188942 | Good et al. | Jun 2016 | A1 |
20160188943 | Linwood | Jun 2016 | A1 |
20160188944 | Wilz et al. | Jun 2016 | A1 |
20160189076 | Mellott et al. | Jun 2016 | A1 |
20160189087 | Morton et al. | Jun 2016 | A1 |
20160189088 | Pecorari et al. | Jun 2016 | A1 |
20160189092 | George et al. | Jun 2016 | A1 |
20160189284 | Mellott et al. | Jun 2016 | A1 |
20160189288 | Todeschini | Jun 2016 | A1 |
20160189366 | Chamberlin et al. | Jun 2016 | A1 |
20160189443 | Smith | Jun 2016 | A1 |
20160189447 | Valenzuela | Jun 2016 | A1 |
20160189489 | Au et al. | Jun 2016 | A1 |
20160191684 | DiPiazza et al. | Jun 2016 | A1 |
20160192051 | DiPiazza et al. | Jun 2016 | A1 |
20160125873 | Braho et al. | Jul 2016 | A1 |
20160202951 | Pike et al. | Jul 2016 | A1 |
20160202958 | Zabel et al. | Jul 2016 | A1 |
20160202959 | Doubleday et al. | Jul 2016 | A1 |
20160203021 | Pike et al. | Jul 2016 | A1 |
20160203429 | Mellott et al. | Jul 2016 | A1 |
20160203797 | Pike et al. | Jul 2016 | A1 |
20160203820 | Zabel et al. | Jul 2016 | A1 |
20160204623 | Haggert et al. | Jul 2016 | A1 |
20160204636 | Allen et al. | Jul 2016 | A1 |
20160204638 | Miraglia et al. | Jul 2016 | A1 |
20160316190 | McCloskey et al. | Jul 2016 | A1 |
20160227912 | Oberpriller et al. | Aug 2016 | A1 |
20160232891 | Pecorari | Aug 2016 | A1 |
20160292477 | Bidwell | Oct 2016 | A1 |
20160294779 | Yeakley et al. | Oct 2016 | A1 |
20160306769 | Kohtz et al. | Oct 2016 | A1 |
20160314276 | Sewell et al. | Oct 2016 | A1 |
20160314294 | Kubler et al. | Oct 2016 | A1 |
20160323310 | Todeschini et al. | Nov 2016 | A1 |
20160325677 | Fitch et al. | Nov 2016 | A1 |
20160327614 | Young et al. | Nov 2016 | A1 |
20160327930 | Charpentier et al. | Nov 2016 | A1 |
20160328762 | Pape | Nov 2016 | A1 |
20160330218 | Hussey et al. | Nov 2016 | A1 |
20160343163 | Venkatesha et al. | Nov 2016 | A1 |
20160343176 | Ackley | Nov 2016 | A1 |
20160364914 | Todeschini | Dec 2016 | A1 |
20160370220 | Ackley et al. | Dec 2016 | A1 |
20160372282 | Bandringa | Dec 2016 | A1 |
20160373847 | Vargo et al. | Dec 2016 | A1 |
20160377414 | Thuries et al. | Dec 2016 | A1 |
20160377417 | Jovanovski et al. | Dec 2016 | A1 |
20170010141 | Ackley | Jan 2017 | A1 |
20170010328 | Mullen et al. | Jan 2017 | A1 |
20170010780 | Waldron et al. | Jan 2017 | A1 |
20170016714 | Laffargue et al. | Jan 2017 | A1 |
20170018094 | Todeschini | Jan 2017 | A1 |
20170046603 | Lee et al. | Feb 2017 | A1 |
20170047864 | Stang et al. | Feb 2017 | A1 |
20170053146 | Liu et al. | Feb 2017 | A1 |
20170053147 | Germaine et al. | Feb 2017 | A1 |
20170053647 | Nichols et al. | Feb 2017 | A1 |
20170055606 | Xu et al. | Mar 2017 | A1 |
20170060316 | Larson | Mar 2017 | A1 |
20170061961 | Nichols et al. | Mar 2017 | A1 |
20170064634 | Van Horn et al. | Mar 2017 | A1 |
20170083730 | Feng et al. | Mar 2017 | A1 |
20170091502 | Furlong et al. | Mar 2017 | A1 |
20170091706 | Lloyd et al. | Mar 2017 | A1 |
20170091741 | Todeschini | Mar 2017 | A1 |
20170091904 | Ventress | Mar 2017 | A1 |
20170092908 | Chaney | Mar 2017 | A1 |
20170094238 | Germaine et al. | Mar 2017 | A1 |
20170098947 | Wolski | Apr 2017 | A1 |
20170100949 | Celinder et al. | Apr 2017 | A1 |
20170108838 | Todeschini et al. | Apr 2017 | A1 |
20170108895 | Chamberlin et al. | Apr 2017 | A1 |
20170118355 | Wong et al. | Apr 2017 | A1 |
20170123598 | Phan et al. | May 2017 | A1 |
20170124369 | Rueblinger et al. | May 2017 | A1 |
20170124396 | Todeschini et al. | May 2017 | A1 |
20170124687 | McCloskey et al. | May 2017 | A1 |
20170126873 | McGary et al. | May 2017 | A1 |
20170126904 | d'Armancourt et al. | May 2017 | A1 |
20170139012 | Smith | May 2017 | A1 |
20170140329 | Bernhardt et al. | May 2017 | A1 |
20170140731 | Smith | May 2017 | A1 |
20170147847 | Berggren et al. | May 2017 | A1 |
20170150124 | Thuries | May 2017 | A1 |
20170169198 | Nichols | Jun 2017 | A1 |
20170171035 | Lu et al. | Jun 2017 | A1 |
20170171703 | Maheswaranathan | Jun 2017 | A1 |
20170171803 | Maheswaranathan | Jun 2017 | A1 |
20170180359 | Wolski et al. | Jun 2017 | A1 |
20170180577 | Nguon et al. | Jun 2017 | A1 |
20170181299 | Shi et al. | Jun 2017 | A1 |
20170190192 | Delario et al. | Jul 2017 | A1 |
20170193432 | Bernhardt | Jul 2017 | A1 |
20170193461 | Jonas et al. | Jul 2017 | A1 |
20170193727 | Van Horn et al. | Jul 2017 | A1 |
20170200108 | Au et al. | Jul 2017 | A1 |
20170200275 | McCloskey et al. | Jul 2017 | A1 |
Number | Date | Country |
---|---|---|
2013163789 | Nov 2013 | WO |
Entry |
---|
Harish et al., “Secured QR-Code Based COD Payment Though Mobile Bill Presentment System Replacing the POS Machine With an Electronic Device”, International Journal of Advance Research in Science and Engineering, vol. No. 5, Issue No. 02, Feb. 2016, www.ijarse.com. |