Secure paper-free bills in workflow applications

Information

  • Patent Grant
  • Patent Number: 10,332,099
  • Date Filed: Tuesday, May 1, 2018
  • Date Issued: Tuesday, June 25, 2019
Abstract
A system and method allows for paperless billing transaction document exchanges between two parties to a sales transaction, with inherent document verification. In an embodiment, the system and method first encrypts the transaction document data according to a password which itself depends on unique transaction data in a particular sales document. The system and method then employs the unique data values to create a first 2-D barcode which directly represents the document data; and to create a second 2-D barcode which represents the encrypted document data. The two matrix codes are overlapped on a cell-by-cell basis into a single visual representation, employing multiple colors for different combinations of overlapped cells. At a receiving end, the two original matrix codes can be separated by extracting the two black and white 2-D barcodes from the combined color code. The data integrity of the received data is confirmed by checking that the encrypted 2-D barcode is consistent with the plaintext 2-D barcode. Additional methods are employed to attach secure, merged biometric image signatures, such as merged fingerprint images, to the merged 2-D barcode form of the transaction document.
Description
FIELD OF THE INVENTION

The present invention relates to sales billing. More specifically, the present invention pertains to invoices, sales receipts, and related transaction documents for business sales; the sales transactions may be conducted person-to-person for deliveries at retailer locations, factory locations, shop locations, and similar sites. The invention further relates to a system and method to eliminate printed paper invoices and printed sales receipts by employing secure, validated transmission of a purely digital/electronic bill (e-bill) of sale and/or sales receipt.


BACKGROUND

Workflow Applications.


Workflow applications are software applications which automate some or all of the steps of a business process. Typically, most business processes have some steps which require human input, such as entry of specific text or selection of specific values. Other steps can be automated and so be handled by the software application. For example, once a user has selected a product, the software may automatically insert a price; or once a user has selected or typed in a purchaser, a previously recorded purchaser address may be automatically filled in.


Product ordering, finance, and billing/payment processes are typically good candidates for workflow applications. For example, a purchase order within a single company may require authorizations from multiple departments. Such a purchase order can be initiated by a requesting party within the company, and then be forwarded electronically, via software, through multiple departments within the company. Final authorization for the purchase order can be given when all required departments have approved. If a workflow application is strictly internal to one company, the integrity and validity of the data processed by the application can often be readily maintained by a single workflow application (or by multiple applications which are designed as an integrated “team” of software elements).


Sales and Billing


Small sales and service transactions are often done “in the field”, for example at a store where a product or service company (or a delivery service working on their behalf) delivers a product to a commercial enterprise or to a home. Such sales transactions typically involve hardcopy paperwork—for example, a bill of sale, a receipt, and/or electronic fund transfer authorizations (for example, credit cards or electronic checks). In such cases, both parties (seller and purchaser) typically need to exchange and retain valid copies of the pertinent documents.


These product/service sales and product/service billing situations provide another business context where workflow applications may be appropriate and helpful. A complication arises in such cases, however, because here the paperwork (which may be substantial in volume) is typically between different companies or organizations—typically between a purchaser and a seller. As a result, the paperwork—whether actual paper, or digital “paperwork”—is not internal to any one organization. Creating copies for each separate party, and establishing, maintaining, and validating the data integrity and security of paperwork exchanged between different companies, may then be a significant challenge.


Direct Store Delivery (DSD).


As noted above, paperwork exchanges may occur outside of an office context, adding a further challenge to maintaining data integrity and validation. Consider for example direct store delivery (DSD) applications. In deliveries from a distributor or product manager directly to a store (or home consumer or other off-business-site customer), purchase orders, bills, and payments may all be exchanged at a consumer's front door, at a loading dock, or in the lobby/reception area of an office or factory.


In DSD applications, the DSD driver provides the bill (invoice) for delivered material to the storekeeper (or retailer, or home consumer). The bill amount may be paid immediately or later. Storekeepers (or retailers) may also provide receipts of some kind to the DSD drivers. These bill and receipt exchanges happen in the field, at the retailer's location.


In all these transactions there is a great deal of paperwork involved, and typically both parties exchange their bills physically. For customary record-keeping purposes, DSD suppliers (or DSD drivers) may need to maintain all these bills (invoices) and receipts for ten to twenty years.


DSD as an Example of Workflow Applications: In this document, many examples, applications, and exemplary methods may be based on exemplary Direct Store Delivery (DSD) use cases or DSD contexts. It will be noted that DSD examples and use cases are employed for purposes of illustration. The present system and method may be applied broadly to many different applications and contexts where transactional documents are employed, agreement documents are employed, or more generally to many types of workflow paperwork. Similarly, throughout this document, references to DSD persons or contexts, such as “DSD drivers”, may be understood to refer more broadly to any personnel engaged in workflow activities.


Advantages of Paper Bills and Receipts (Hardcopy)


Paper bills and receipts are still used due to the following advantages:

    • Paper bills and receipts cannot be tampered with easily. If they are tampered with, that can generally be readily ascertained through visual inspection.
    • Easily detectable forged signatures: Bills and receipts are generally validated through hand-written signatures. Legitimate hand signatures leave pressure lines on the paper, which can be used to identify forged signatures.
    • Company stamps—which are physically stamped onto the paper—are also considered an authentication mark of the company or parties involved in the transaction.
    • Hardcopy paperwork is a familiar and established way of conducting business.


Disadvantages of Paper Bills and Receipts (Hardcopy)


Paper bills, paper receipts, and other physical hardcopy have distinct, substantial disadvantages as well.

    • Daily bill submissions: DSD drivers (or any workflow staff-persons) need to collect all bills for each day from the other parties to the sales transactions, and the DSD drivers must submit the bills to a home office or central office. If even one bill is misplaced, that may lead to billing and record-keeping inconsistencies.
    • Bill preservation and searching of bills: Workflow organizations (for example, DSD suppliers) need to preserve the bills for an extended time. If there are many bills, then it is very difficult to maintain all the paper records and to search for specific bills.
    • Bills cannot be duplicated or recovered: Old bills cannot be duplicated with the same features as the original bill. If bills are torn or otherwise in an unusable state, restoring them to the state of the original bill may be difficult or impossible.


Electronic Scanning and Storage (Softcopy)


Pure usage of paper bills and paper receipts has the disadvantages noted immediately above. For this reason it has become common to have paper documents scanned and stored electronically. However, this solution presents problems as well.


Cost of scanners, printers, and paper: In workflow applications, such as direct store delivery (DSD), both parties to the transaction (bill distributor and bill receiver) must have scanners and printers at the location where business is transacted, for example at individual stores. There is some cost involved in purchasing scanners, printers, toner, and the paper itself.


Carrying scanners, printers, and papers: In some of the workflow applications, the bill distributor (typically the seller or delivery service for a product or service) needs to bring along scanners, printers, and papers during business travel. Carrying and handling of these items (along with the actual goods for sale) adds an extra burden for the delivery or service personnel.


Not environmentally friendly: Scans originate from paper. Using paper is not good for the environment, since paper is made from trees.


Tampering: Scanned copies, such as Portable Document Format (PDF) copies of printed bills and receipts, may be easily altered. For example, a scanned copy could be altered before transmission from one location to another, with the recipient having no way to know the PDF file was altered. In some embodiments, the present system and method addresses this deficiency.


Bill searching: Scanned copies are typically image copies, and cannot be readily searched for text or numeric data. Text recognition may be performed on such documents, but this requires extra processing. Further, handwritten information, as may be written on a print document by persons engaged in a sales transaction, often does not lend itself to reliable text recognition.


Transaction Validation Challenges


Sales documents, bills, and receipts are typically validated by the parties involved in the transactions. This conventionally involves signatures, which may be done in pen-and-ink on paper and then scanned; increasingly, validation is also done by stylus signature on contact-sensitive electronic display screens or tablets. Even these approaches have disadvantages, especially for documents which must be transmitted from one party to another.


Signature validation: Signatures captured from paper by a scanner will not be the same as actual signatures (since they lack the indentations made in paper by an actual signature), and so may be less reliable for signature validation. Signatures which are originally captured electronically on tablets could easily be digitally “swapped” with false signatures by a malicious party.


Tampering: Softcopy, such as PDF scans of printed and hand-signed documents, can be tampered with and altered easily (except possibly in cases where digital signatures are used).


Digital Signatures and Digital Watermarks


Another method used for document validation is a digital signature, which is a mathematical process to demonstrate the authenticity of digital documents (for example, to authenticate PDF scans of print bills and receipts). Digital signatures can be validated mathematically by a recipient, which provides the recipient with a degree of confirmation the message was created by a known sender (authentication), that the sender actually sent the message (non-repudiation), and that the message was not altered in transit (integrity).


Here again, however, there are disadvantages with digital signatures.

    • Digital signatures require keys (specific sequences of digital symbols) from third parties, which involves some cost.
    • If a printout of the finished document is made, the printed document does not retain the digital signature. This confines validated document use to electronic use only, even though hardcopy may at times be preferred during some processing. If a picture of the transaction document is taken with another device (such as a mobile phone or camera), the digital signature is again not retained.
    • If either a public or private key is changed, then the document sender (typically a seller or bill distributor) needs to maintain all the previous keys as well.
    • If the public and/or private key is changed, then validating the old signed documents becomes a difficult process.
    • Digital signature validity is less than ten years. If any document needs to be retained for more than ten years, digital signatures do not provide a security solution.
    • If, over time, a decision is made to change the encryption and/or decryption algorithms, then validating the old signed documents is again not an easy process, and new validation algorithms need to be adopted by both the sender and recipient(s) of billing documents and receipts.
    • There are problems with digital watermarks as well. For example, if the watermark is corrupted, that may corrupt the entire data file.


What is needed, then, is a system and method for recording financial transaction documents (such as sales bills and receipts) in the field at the point of sale/transaction, at the time of sale/transaction. The system and method should be entirely digital, avoiding paper altogether; and once the document is sent electronically, the system and method should provide for robust data validation at the receiving end. The system and method should also be practical, convenient and robust, employing digital validation processes which enable secure duplication and electronic transmission of the documents in a manner which is highly reliable, readily validated, and relies essentially only on data inherent in the transaction itself, without the need for third-party keys or validation algorithms (which can be modified for various reasons).


SUMMARY

Accordingly, in one aspect, the present invention embraces portable hardware (tablet computers and/or cell phones, and similar) in conjunction with software, which completely replaces paper billing and receipts for DSD and for similar applications where paper or paperless documents are required to establish, confirm, or support some kind of mutual agreement between two parties. The mutual agreement may entail numeric data pertaining to finance or sales, but may also be any other kind of agreement between two parties which has unique document data.


The system and method allows for paperless billing and paperless receipt document exchanges, with inherent document verification, between any two parties (such as seller and purchaser) who are at the same location. The system and method enables the parties to exchange the transaction data through Bluetooth, USB, or any other wireless or wired communication. This system and method even works if the parties want to share data (that is, bills, receipts) by taking pictures of documents (for example, taking pictures via their cell phones).


In an embodiment, the system and method entails employing the unique data values which are inherent in a particular transaction document (such as data from a bill of sale) to create two visual representations, such as two-dimensional (2-D) barcodes or matrix codes.


The document data is first encrypted to ciphertext according to a password which itself depends on the unique document data. A first 2-D barcode is then generated, which directly represents the document data, according to standard matrix coding algorithms. A second 2-D barcode represents the document data in the encrypted form. This creates a unique visual encoding/encryption specific to the particular transaction document.


Finally, the two 2-D barcodes are overlapped to form a single, color visual barcode-like representation, employing multiple colors for various combinations of overlapped cells, according to a cell mixing algorithm.


At a receiving end, the two original 2-D barcodes can only be separated by software which employs a proprietary, reverse algorithm to extract the two black and white 2-D barcodes from the combined color code. One of the two retrieved 2-D barcodes, which stored the document data in plaintext form, is then used to retrieve the original document data. The other retrieved matrix symbol has the ciphertext document data.


The original document data is then employed to generate a second ciphertext 2-D barcode, according to the same algorithm employed above. If the two 2-D barcodes—the original and the newly generated—are a match, this indicates that data integrity has been maintained. If the two 2-D barcodes—original and newly generated—are not a match, this indicates that data integrity has been lost or tampered with.


In an embodiment, the system and method employs fingerprints from the transaction participants for document signatures or validation. The system and method entails employing the unique data values for the particular transaction document (such as data from a bill of sale) to generate a unique fingerprint shuffling sequence.


Each fingerprint, from each transaction participant (for example, seller and buyer), is broken into multiple image parts. For each fingerprint, these image parts are then spatially shuffled according to the unique fingerprint shuffling sequence.


Finally, the two shuffled fingerprint images are combined into a single, overlapping visual representation, employing multiple colors, according to a proprietary image overlap algorithm.


At a receiving end, the fingerprint shuffling sequence can be recovered from the original document data. At the receiving end, the two original shuffled fingerprint images can then be separated by software which employs a proprietary, reverse shuffling algorithm. This employs a reverse shuffling sequence to extract the two original black and white fingerprints from the combined, color shuffled fingerprint. The fingerprints can then be validated against separately stored records of fingerprints of the appropriate parties.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a system diagram of an exemplary electronic device according to the present system and method.



FIG. 2 illustrates an exemplary sales invoice with unique sales data.



FIG. 3 is a flow chart of an exemplary method for creating a password, an image shuffling sequence, or other encryption-related data values for a document, where the encryption-related data value(s) is based on unique document content.



FIG. 4A is a schematic representation of the conversion of plaintext alphanumeric transaction data to a 2-D barcode.



FIG. 4B is a schematic representation of the conversion of ciphertext transaction data to a 2-D barcode.



FIG. 5 schematically illustrates an exemplary method for merging a first 2-D barcode and second 2-D barcode into a merged 2-D multicolor barcode image.



FIG. 6 illustrates an exemplary method to convert a graphical biometric signature, such as a fingerprint image, into a spatially shuffled image of the biometric signature.



FIG. 7 schematically illustrates an exemplary method for merging a first shuffled biometric image and a second shuffled biometric image into a merged multicolor biometric image.



FIG. 8 illustrates concatenating a 2-D barcode data image and a biometric signature image into a combined final document image.



FIG. 9 illustrates methods for direct transmission and for indirect transmission of a graphical image which encodes data, and also for decoding and validating the received graphical image.



FIG. 10 is a flow-chart of an exemplary method to encode commercial transaction data and transaction signatures into a secure, self-verifiable image format.



FIG. 11 is a flow-chart of an exemplary method for data recovery from a secure, self-verifiable transaction image and for validation of the recovered data by a receiving party.





DETAILED DESCRIPTION

In the following description, certain specific details are set forth in order to provide a thorough understanding of various embodiments. However, one skilled in the art will understand that the invention may be practiced without these details. In other instances, well-known structures associated with computers, tablets, cell phones, or with other digital devices, and/or with data display, and/or with data storage or data transmission, have not been shown or are not described in detail to avoid unnecessarily obscuring descriptions of the embodiments.


Unless the context requires otherwise, throughout the specification and the claims which follow, the word “comprise” and variations thereof, such as, “comprises” and “comprising” are to be construed in an open sense, that is as “including, but not limited to.”


Reference throughout this specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. Thus, the appearances of the phrases “in one embodiment” or “in an embodiment” in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.


The headings provided herein are for convenience only and do not interpret the scope or meaning of the claimed invention.


Reference Numbers: Reference numbers are used throughout the figures. The first digit of a reference number generally indicates the first drawing where the associated element appears. For example, an element with reference number “207” first appears in FIG. 2.


In some instances, an element may be shown in both a generic form and a more specific form or species; in these cases, the specific form or species may be indicated by an appended period (“.”) followed by a digit or digits to distinguish a species of the general form. For example, a general fingerprint image may have a reference number of 605; while a first fingerprint image of a first person may have a reference number 605.1, and a second fingerprint image of a second person may have a reference number of 605.2.


In this document, an exemplary sales invoice is used to illustrate an exemplary transaction document. Persons skilled in the relevant arts will appreciate that the present system and method may be applied to any document which needs to be protected from tampering.


Portable Data Receiving and Display System (Tablet)


The present system and method may enable recording of data pertaining to workflow transactions, including financial and sales data such as sales transaction data, at a field location such as a retail store. Sales transactions are just one exemplary case of many possible applications of the present system and method. The present system and method is not limited to sales, and persons skilled in the relevant arts will appreciate that the present system and method can be employed with and applies to many kinds of workflow transactions.


The present system and method may employ a portable computing system or processing system, such as a tablet computer 100, to record, store, display, and transmit data. The data may include both numeric and text data, and also graphical or image data, such as an image of a human fingerprint captured via a biometric scanner. The biometric scanner may be an internal module, or may be an external (stand-alone) device which may be coupled to the tablet computer 100 via a wired or wireless link.


Other related technologies may be employed as well, including: a dedicated portable electronic system for sales, invoicing, and receipts; cellular phones; smart phones; personal digital assistants; vehicle-mounted computers; stationary (e.g., desktop) computers; or distributed computers (e.g., network systems).



FIG. 1 schematically depicts a system diagram of an exemplary electronic tablet 100, such as a portable tablet computer 100, according to the present system and method. Tablet 100 includes a processor or microprocessor 105, dynamic memory (RAM) 107, and user interface elements. RAM 107 is configured to store computer code or instructions which enable processor 105 to perform the present system and method in accordance with methods and steps described further below in this document. RAM 107 is also configured to store data which may change dynamically in the course of implementing the present system and method, including for example transaction data and image or graphic data which may be generated by the present system and method.


Tablet 100 may employ static memory 109 for long-term storage of operating instructions, an operating system, and long-term data storage. Static memory 109 may include Read-Only Memory (ROM), also known as non-volatile memory (NVRAM), a hard disk drive, flash drives and other removable-but-non-transitory storage, CD ROMs, PC-CARDs, memory cards, and/or other non-transitory storage media and code storage media as may be developed in the future.


Tablet 100 may also include one or more wireless communication systems 115. These wireless systems 115 may enable communication with other electronic devices and with computer networks via wireless networks according to such protocols as Bluetooth, WiFi, cellular communications, a local area network (LAN), an ad hoc network, a cellular network (e.g., a GSM network, a CDMA network, or an LTE network), and other protocols and wireless systems well known in the art or yet to be developed.


Tablet 100 may also include ports (not shown in FIG. 1) and suitable electronic support for wired communications 111 via USB, Ethernet, and other protocols known in the art.


Tablet 100 according to the present disclosure may include a display monitor 120 for display of user data and graphical images, and also for display of elements such as menus, dialog boxes, and display buttons which enable or facilitate user control of the present system and method.


Tablet 100 may include a keyboard 130 for entry of text and numeric data. In an alternative embodiment, display and keyboard functions may be integrated into a touch-screen (or pressure-sensitive) display 120 as is well known in the art.


Tablet 100 may include a mouse or touch-pad 140, also well-known in the art, for further control of a display pointer and other elements visible on monitor 120. In an alternative embodiment, mouse functions may be integrated into a touch-screen display 120 as well.


Tablet 100 may include audio user-interface elements 150 as well. These may include speakers, headphones, and/or a microphone. The microphone may be configured to receive verbal input from a user, and suitable software or firmware may translate the audio input into text and numeric data to be displayed on display 120, or may translate the audio input into control commands to operate tablet 100 according to the present system and method.


Other elements and means for user input and output, such as eye-glasses with optical tracking and interaction capabilities, or holographic display and interaction systems, may also be envisioned within the scope and spirit of the present system and method.


Tablet 100 may include a biometric sensor 160. Biometric sensor 160 may be used to identify a user or users of device 100 based on unique physiological features. In an embodiment, biometric sensor 160 may be a fingerprint sensor used to capture an image of a user's fingerprint. In an alternative embodiment, biometric sensor 160 may be an iris scanner or a retina scanner configured to capture an image of a user's iris or retina. As noted above, in an embodiment biometric sensor 160 may be an internal module of tablet 100. In an alternative embodiment, biometric sensor 160 may be an external (stand-alone) device which may be coupled to tablet computer 100 via a wired link (e.g., USB) or wireless link (e.g., Bluetooth connection).


In an embodiment of the present system and method, tablet 100 may include a camera 170 for capture of images. In an embodiment, camera 170 may be used to capture images of documents, such as sales transaction documents, which may be displayed on paper or by other electronic devices. In an embodiment, camera 170 may serve multiple roles, including serving effectively as biometric sensor 160 for the capture of user fingerprints, iris scans, retinal scans, or even facial images of parties to a sales transaction.


The processor 105 is communicatively coupled via a system bus 195 to memory 107, NVRAM 109, wired connections 111, wireless transceivers 115, monitor 120, keyboard 130, mouse 140, audio I/O 150, biometric sensor 160, camera 170, and to such other hardware devices as may be necessary or helpful to implement the present system and method. Tablet 100 includes suitable electronic support for bus 195 to mediate interactions between all elements of device 100.


Typically, processor 105 is configured to execute instructions and to carry out operations associated with the tablet 100. For example, using instructions retrieved from the memory 107 (e.g., a memory block) and/or static memory 109, processor 105 may control the reception and manipulation of input and output data between internal components of the tablet 100. Processor 105 typically operates with an operating system to execute computer code and to produce useful data. The operating system, other computer code, and data may reside within memory 107 and/or memory 109 that is operatively coupled to processor 105 via bus 195.


The operating system, other computer code, and data may also be hard-coded into tablet 100 either as dedicated logic within processor 105 or as non-volatile memory known as firmware 109.


In an embodiment, the instructions and data employed by the electronic device may be organized into one or more software modules, firmware modules, drivers, or other programs. Such modules may be implemented, in whole or in part, as one or more of: dedicated logic in processor 105; firmware 109; and dedicated, specialized processors (not shown in FIG. 1).


Exemplary modules which may be employed may include:

    • a password generator to generate unique passwords based on selected input data;
    • a sequence generator to generate a sequence of numbers based on selected input data;
    • a 2-D barcode generator to generate two-dimensional (2-D) barcodes (such as matrix codes) from text and numeric data;
    • image processing modules configured to superimpose, to spatially rearrange, and/or color-adjust images or portions of images; and
    • a random number generator which generates random numbers (or pseudo-random numbers).


It will be understood by persons skilled in the relevant arts that the above indicated modules may employ any of several algorithms and methods well known in the art, or may employ new or novel methods yet to be developed.


Transaction Documents (Bills, Receipts, Contracts)


The present system and method is explained herein largely with reference to an exemplary sales transaction and in particular an exemplary sales invoice 200. This is to be understood as being by way of example and illustration only, and should not be construed as limiting. The present system and method is applicable to many different types of documents and records which record or in other ways pertain to or may be associated with mutual agreements between two parties, including for example and without limitation: contracts, bids, licensing agreements, proposals, technical specifications, requirements documents, service agreements, parts lists, deeds, liens, letters of understanding, title documents, receipts, easements, and covenants.



FIG. 2 illustrates an exemplary sales invoice template 200 (also referred to herein, interchangeably, simply as the “sales invoice 200”) which is populated with exemplary sales data values 240. The data 240 may be seen for example on a display screen 120 of portable tablet computer 100. Display screen 120 may be configured to display the invoice template 200 which can be updated with new, unique data 240 as new sales transactions occur. Display screen 120, and in particular invoice 200, may also be configured to accept data entry either directly on display 120 (for example via touch-screen entry), or via a keyboard 130, audio input/output 150, or other means.


Template 200 may include various display buttons, menus, and other user-interface elements, not shown in FIG. 2, to facilitate entry and updating of new values. For example, a [New] button or icon may indicate that the template should be cleared so that a new transaction may be recorded, or a [Save] button or icon may indicate that newly entered data should be saved in memory 107. Other data- and transaction-oriented interface elements, known in the art, may be present as well.


In practical, real-world application, each transaction bill (invoice) or receipt will likely have some kind of unique and critical information. Normally each company maintains the predefined bill and receipt templates 200 (or bill and receipt formats 200) so that new transactions can be defined in the field, as needed.


Multiple unique transaction values 240 may be combined into a single data set which is itself unique. Further, through various algorithms, a unique data set may be employed to generate a unique encryption key or password (see FIG. 3 below), a unique sorting sequence (see FIG. 3 below), or other unique values, as discussed further below.


Examples of transaction data 240 which are typically essential to each transaction, and which are unique or distinctive, may include, for example and without limitation:

    • Date with Time (240.1)
    • Invoice Number (240.2)
    • Total Amount Due (240.3)


Persons skilled in the art will recognize that some such data may not be universally unique, but will have widely varying values across many transactions. For example, a Total Amount Due 240.3 may occasionally have the same exact value in two completely separate transactions, but will generally vary greatly from one transaction to another, and may have a very large range of possible values.


Other invoice values may be common between multiple transactions, but may be unique between different vendors or customers. They may be referred to as “semi-unique transaction values” or “distinctive transaction values”. These collectively may still be used as elements of a combined data set which is, in totality, unique among all commercial transactions. Additional invoice values which may be employed within the scope of the present system and method include, for example without limitation:

    • Sender ID (240.4)
    • Name of the Sender (Distributor) (240.5)
    • Name of the Receiver (240.6)
    • Receiver Company Name (240.7)
    • Receiver Company ID (not shown in FIG. 2)
    • Receiver Company phone number or e-mail (240.8).


The above identified transaction values 240 are exemplary only. Other billing, receipt, and transaction values 240 may be employed as well on transaction template 200. These exemplary values 240, and similar values, are the parameters identified in transaction template 200. These parameters 240 may be employed for generating passwords and other required elements of the present system and method, as discussed further below.


Generation of a Document-Unique Password and Shuffling Sequence



FIG. 3 is a flow chart of a generalized method 300 for creating a password 360, an image shuffling sequence 365, or some other signature data value which is substantially unique and distinct for a given document, such as a sales invoice 200.


In step 305 of method 300, designated critical transaction data values 240 are extracted from a sales invoice, bill, receipt, or similar transaction document 200. Exemplary transaction values 240 were discussed above in conjunction with FIG. 2. These values 240 serve as parameters 240 for password and shuffling sequence generation.


In step 310 of method 300, the extracted data values 240 are provided as input parameters 240 to suitable algorithms 350 for key generation. Key generation algorithms 350 may include, for example and without limitation, a password-generating algorithm 350.1 or an image shuffling sequence algorithm 350.2. Other algorithms may be employed as well.


In an embodiment, the present system and method employs one or more unique (that is, proprietary, not-publicly known, not-commercially available, not open-source, etc.) key generation algorithms 350 for enhanced data security. General design methods and principles for the development of such algorithms are known in the art, and so specific, possible proprietary algorithms 350 are not discussed here.


In an alternative embodiment, the present system and method employs one or more publicly known, commercially available, or open-source key generation algorithms 350.


In an alternative embodiment, multiple key generation algorithms 350 may be employed (either the same algorithm or different algorithms), applied for example consecutively, and employing either generally known algorithms or proprietary algorithms, or both.


In step 315 of method 300, the algorithm(s) 350 output one or more unique keys or key sequences 360, 365 which are to be employed for data encryption, as described further below in this document. Such keys may include, for example and without limitation:

    • one or more unique encryption passwords 360 or encryption keys 360 to be employed as part of a process for encrypting text and/or numeric data; and
    • an image shuffling sequence 365, such as a fingerprint shuffling sequence 365, which may be employed to spatially shuffle parts of an image into an apparently random or pseudo-random order. The sequence may map a default, ordinal ordering of spatial elements (1, 2, 3, 4, etc.) to an apparently random spatial ordering sequence (for example, 5, 2, 11, 3, etc.). Shown in FIG. 3 is an exemplary specific fingerprint shuffling sequence 365.1. The sequence 365.1 shown is arbitrary and purely for illustration, and many other sequences may be generated by the present system and method.


In an embodiment of the present system and method, additional or alternative sequences or patterns may be generated as well from the unique document data. For example, in addition to shuffling sub-parts of a biometric image, parts or segments of a biometric image may be flipped (horizontally and/or vertically) as well, or rotated; the present system and method may generate suitable flip or rotation sequences, or other image alteration patterns, as required, from the unique document data.
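By way of illustration only, the following sketch shows one possible (non-proprietary) realization of method 300 in Python: the designated transaction values 240 are concatenated and hashed, the digest supplies an encryption password 360, and the digest also seeds a pseudo-random generator to produce a repeatable shuffling sequence 365. The function and field names are hypothetical, and the specific hash-and-seed construction is an assumption for this sketch, not the proprietary algorithm 350 itself.

    import hashlib
    import random

    def derive_password_and_sequence(transaction_values, num_parts=16):
        # Step 305: concatenate the designated critical transaction values 240.
        combined = "|".join(str(v) for v in transaction_values)
        # Step 310: hash the combined data; the digest keys everything below.
        digest = hashlib.sha256(combined.encode("utf-8")).digest()
        # Step 315, output 1: a document-unique encryption password 360.
        password = digest.hex()[:32]
        # Step 315, output 2: a repeatable image shuffling sequence 365,
        # here a permutation of the numbers of 16 image sub-parts.
        rng = random.Random(digest)
        sequence = list(range(1, num_parts + 1))
        rng.shuffle(sequence)
        return password, sequence

    # Example with arbitrary sample invoice data (Date/Time, Invoice No., Total):
    password_360, sequence_365 = derive_password_and_sequence(
        ["2018-05-01 10:42", "INV-00123", "1,250.00"])

Because both parties can derive the password and sequence from the same transaction data 240, a receiving end can regenerate them without any third-party key exchange.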


Generation of 2-D barcodes for Transaction Data (Plaintext and Ciphertext)


In an embodiment, the present system and method generates two visual representations of transaction data 240. The first geometric representation 410.1 is of the data 240 in its native (unencrypted or plaintext) form, while the second geometric representation 410.2 is of the data 240 in an encrypted form (also referred to as “ciphertext” 425, see FIG. 4B below).


In an embodiment, the geometric representation 410 is in the form of a two-dimensional barcode. In an embodiment, the geometric representation 410 is in two colors only, or “monochrome”, the two colors often being (as is conventional in the art) black and white. More generally, in an embodiment, the geometric representation 410 is in two colors only, or “monochrome”, one being a first high intensity color and the second being a second low intensity color, so as to provide a strong intensity contrast for clarity of scanning.


In an embodiment of the present system and method, the 2-D barcodes 410 may be QR codes 410. In alternative embodiments the geometric representations may take other forms well known in the art, such as Aztec codes, Data Matrix, MaxiCode, PDF 417, stacked (multi-row) barcodes, or other such two-dimensional geometric codes known or yet to be developed.


In this document, the terms “barcode” 410, “two dimensional barcode” 410, “2-D barcode” 410, “geometric data representation” 410, “matrix code” 410, and “QR code” 410 are used interchangeably to reflect geometric representations of alphanumeric data 240 which are typically two-dimensional, though in some embodiments may be strictly linear (one dimensional).


2-D barcode for Plaintext (Unencrypted) Data:



FIG. 4A is a schematic representation of the conversion of alphanumeric transaction data 240 to a first 2-D barcode 410.1 according to an embodiment of the present system and method. In an embodiment, the present system and method extracts the unique transaction data 240 from the invoice, bill, or other document 200 for the transaction. (In an embodiment, the transaction data 240 may already have been extracted to create a password, as described above in relation to FIG. 3. Such transaction data 240 may then be readily available, for example, in memory 107 of tablet computer 100.)


In an embodiment, the transaction data 240 is then processed according to known algorithms 405 for generating 2-D barcodes 410, such as a QR code or other matrix code, stacked barcode, etc. In an alternative embodiment, a proprietary conversion algorithm 405 may be employed to generate a proprietary 2-D barcode 410. In FIG. 4A, the algorithm is represented schematically by an arrow 405. Such known algorithms 405 will be well understood by persons skilled in the relevant arts, and are not covered in detail here.


The result is a first 2-D barcode 410.1, which encodes the plaintext transaction data 240, that is, the transaction data without encryption. Persons skilled in the relevant arts will recognize that the transaction data 240 may be readily recovered from 2-D barcode 410 through decoding algorithms generally known in the art, or through a proprietary decoding algorithm for a proprietary 2-D barcode.
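As a concrete sketch only (assuming the open-source Python qrcode package as the generation algorithm 405; any standard or proprietary matrix code generator may be substituted, and the field layout is hypothetical), the first 2-D barcode 410.1 may be produced as follows:

    import qrcode

    # Serialize the unique transaction data 240 (illustrative field layout).
    transaction_data_240 = "2018-05-01 10:42|INV-00123|1,250.00"

    # Generate the first 2-D barcode 410.1 directly from the plaintext data.
    plaintext_qr = qrcode.make(transaction_data_240)
    plaintext_qr.save("barcode_410_1.png")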


2-D barcode for Ciphertext (Encrypted) Data:



FIG. 4B is a schematic representation of the conversion of alphanumeric transaction data 240 to a second 2-D barcode 410.2 according to an embodiment of the present system and method. In an embodiment, the present system and method extracts the unique transaction data 240 from the invoice, bill, or other document 200 for the transaction. (In an embodiment, the transaction data 240 may already have been extracted in previous steps. Such transaction data 240 may then be readily available, for example, in memory 107 of tablet computer 100 or any other applicable processing device.)


In an embodiment, the unencrypted transaction data 240 (generically referred to in the art as “plaintext”) is encrypted using an encryption algorithm 420 or algorithms 420. In an embodiment, the encryption algorithm 420 will employ, as a key or password, the encryption password 360 previously calculated from the transaction data 240 (see FIG. 3 above). In this way, the data encryption process is specifically keyed or tied to the unique plaintext transaction data 240 for the specific transaction under consideration.


The output is encrypted transaction data 425 (generically referred to in the art as “ciphertext”), which may be ciphertext invoice data 425, ciphertext receipt data 425, or other ciphertext transaction data 425. In an embodiment, the ciphertext data 425 is generally not understandable or meaningful as presented, and in an embodiment (symmetric or “private key” encryption) can generally only be read if first decrypted employing the same encryption password 360.


In an embodiment, the encryption algorithm 420 may be any of several known algorithms, including for example and without limitation: Pretty Good Privacy (PGP), Triple DES, RSA, Blowfish, Twofish, and AES. In an alternative embodiment of the present system and method, a proprietary encryption algorithm 420 may be employed.
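As an illustrative sketch only, the following continues the Python example above and uses the symmetric Fernet recipe from the Python cryptography package as a stand-in for encryption algorithm 420; an embodiment may instead use any of the ciphers named above, or a proprietary algorithm. The key-derivation step shown (hashing the password 360 into a 32-byte Fernet key) is an assumption made for this sketch.

    import base64
    import hashlib
    from cryptography.fernet import Fernet

    def encrypt_transaction_data(plaintext_240: str, password_360: str) -> bytes:
        # Fernet requires a URL-safe base64-encoded 32-byte key, so derive
        # one from the document-unique password 360.
        key = base64.urlsafe_b64encode(
            hashlib.sha256(password_360.encode("utf-8")).digest())
        cipher = Fernet(key)
        # The returned bytes are the ciphertext transaction data 425.
        return cipher.encrypt(plaintext_240.encode("utf-8"))

    ciphertext_425 = encrypt_transaction_data(transaction_data_240, password_360)

Because the key is derived from the transaction data itself, a recipient holding the plaintext barcode can regenerate the same key and verify the ciphertext, with no externally issued key required.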


In an alternative embodiment of the present system and method, additional encryption passwords 460 may be employed in addition to the encryption password 360 calculated from transaction data 240. In an embodiment, one or more additional encryption passwords 460 may be calculated based on the same unique transaction data, but employing different password-generating algorithms 350. Other sources of additional encryption passwords 460 may be envisioned as well, including for example and without limitation a password which has been privately shared (via other systems and methods, such as e-mail or secure network connection) between the two parties to a sales transaction.


The ciphertext transaction data 425 is then processed according to known algorithms 405 for generating a 2-D barcode 410.2, such as a QR code. In an alternative embodiment, a proprietary conversion algorithm 405 may be employed to generate a proprietary 2-D barcode 410.2. In FIG. 4B, the algorithm is represented schematically by an arrow 405. Such known algorithms 405 will be well understood by persons skilled in the relevant arts, and are not covered in detail here.


The result is a second 2-D barcode 410.2, which encodes the ciphertext transaction data 425.
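Continuing the illustrative sketch above (with the same assumed qrcode package and Fernet-style ciphertext), the second barcode may be generated from the ciphertext just as the first was generated from the plaintext:

    # Generate the second 2-D barcode 410.2 from the ciphertext 425.
    # Fernet ciphertext is URL-safe base64, so it decodes cleanly to text.
    ciphertext_qr = qrcode.make(ciphertext_425.decode("ascii"))
    ciphertext_qr.save("barcode_410_2.png")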


Merging/Combining the Two 2-D Barcodes into One Image


In an embodiment of the present system and method, the two 2-D barcodes generated above—2-D barcode 410.1 representing the plaintext transaction data, and 2-D barcode 410.2 representing the encrypted transaction data (ciphertext)—are merged into a single combined 2-D barcode image 510, referred to equivalently and interchangeably herein as the “merged 2-D barcode image” 510, “merged image” 510, “merged matrix code image” 510, “multicolored matrix code” 510, and similar terms.


The combined 2-D barcode image 510, or merged image 510, is generated in such a way that a third-party who is viewing or parsing the merged image 510 would generally not be able to extract any data from the image 510. This is because a third-party would require access to a proprietary algorithm to reconstruct the two original 2-D barcodes 410. At the same time, with access to the appropriate algorithm, the two initial 2-D barcodes 410 can later be recovered from the merged image 510.



FIG. 5 provides a schematic illustration of an exemplary method 500 for merging plaintext 2-D barcode 410.1 and ciphertext 2-D barcode 410.2 into a merged 2-D barcode image 510.


In an embodiment, plaintext 2-D barcode 410.1 and ciphertext 2-D barcode 410.2 are graphically merged by combining each corresponding cell from the two source 2-D barcodes 410, and generating a designated color cell 515 for different combinations of source cells. “Corresponding cells” are cells which share a same coordinate position; for example, the cells in row X, column Y of plaintext 2-D barcode 410.1 and of ciphertext 2-D barcode 410.2 are corresponding cells.


Understood another way, plaintext 2-D barcode 410.1 may be overlapped directly on top of ciphertext code 410.2 (or vice versa). The overlap of any two source cells generates a cell combination 515 which may be assigned a specific color.


In an exemplary embodiment illustrated in FIG. 5:

    • a cell combination 515.1 of two white cells generates a resulting white cell;
    • a cell combination 515.2 of two black cells generates a resulting black cell;
    • a cell combination 515.3 of a white cell from plaintext 2-D barcode 410.1 overlapping a black cell from ciphertext 2-D barcode 410.2 generates a resulting red cell; and
    • a cell combination 515.4 of a black cell from plaintext 2-D barcode 410.1 overlapping a white cell from ciphertext 2-D barcode 410.2 generates a resulting blue cell.


The above combinations, as illustrated in FIG. 5, are exemplary only. Other combination schemes may be envisioned within the scope and spirit of the present system and method. For example, in an alternative embodiment, two overlapped white cells could generate a black cell, or could generate yet another color, such as green.


In an alternative embodiment, a designated cell combination could be randomly mapped to any of several designated color options for merged cells. For example, a merging 515.3 of a white cell over a black cell could be mapped randomly to any of white, red, or brown. For another example, a merging 515.4 of a black cell over a white cell could be mapped randomly to any of blue, orange, or violet. For third-parties who do not know the proprietary mapping algorithm, the use of additional possible output colors may further confound any efforts to “unmerge” or separate the merged image 510 into the original two 2-D barcodes 410.


The output result of the process is the merged, 2-D barcode 510 (which in FIG. 5 is illustrated in cells with shades of gray rather than in color).


Sizes (Cell Dimensions) of the 2-D Barcodes: It will be noted that the method 500 may be applied to two 2-D barcodes 410 which are of equal size (that is, equal horizontal numbers of cells and equal vertical numbers of cells). The method 500 may also be applied to a first 2-D barcode 410 of a first size and a second 2-D barcode 410 of a different second size, as illustrated in FIG. 5. (For example, the ciphertext data 425 may be shorter than the plaintext data 240, or longer. The resulting ciphertext barcode 410.2 may be respectively smaller or larger than the plaintext barcode 410.1.) In the latter case, some cells of the larger barcode 410 may remain “unmerged” with any cells of the smaller barcode 410. The merger of the remaining, overlapping cells of the two barcodes 410 still provides security against third-parties attempting to retrieve the two original barcodes. In an embodiment of the present system and method, unmerged cells may remain in their original black and white colors in the merged barcode 510. In an alternative embodiment, unmerged cells may still be mapped to alternative colors, to render the merged 2-D barcode 510 still less susceptible to decoding by unauthorized parties.


Cell-by-Cell and Pixel-by-Pixel: It will be noted that 2-D barcode merging has been described above as being implemented on a cell-by-cell basis. Persons skilled in the relevant arts will appreciate that, in an alternative embodiment, the color combination algorithms of the present system and method may be implemented instead on an image-pixel by image-pixel basis.


Exemplary Code: Presented here is exemplary code (shown below as runnable Python-style code for concreteness; equivalent logic may be implemented in C, C++, Java, or a similar programming language, or expressed as pseudocode) of a kind which may be employed to merge plaintext 2-D barcode 410.1 and ciphertext 2-D barcode 410.2 into a merged, 2-D barcode image 510:














    # A runnable Python rendering of the exemplary merge logic, using the
    # Pillow imaging library for per-pixel access. For simplicity it assumes
    # two equal-sized barcodes; see the size discussion above.
    from PIL import Image

    WHITE, BLACK = (255, 255, 255), (0, 0, 0)
    RED, BLUE = (255, 0, 0), (0, 0, 255)

    def merge_barcodes(actual_invoice_qr, encrypted_invoice_qr):
        # Merge plaintext 2-D barcode 410.1 and ciphertext 2-D barcode 410.2
        # into a single multicolor merged image 510.
        actual = actual_invoice_qr.convert("RGB")
        encrypted = encrypted_invoice_qr.convert("RGB")
        width, height = actual.size
        merged = Image.new("RGB", (width, height))
        for x in range(width):
            for y in range(height):
                a = actual.getpixel((x, y))
                e = encrypted.getpixel((x, y))
                if a == e:
                    # Merge two pixels of the same color: copy it through.
                    merged.putpixel((x, y), e)
                elif a == WHITE and e == BLACK:
                    # Merge two pixels of different colors: white over black.
                    merged.putpixel((x, y), RED)
                else:
                    # Black over white.
                    merged.putpixel((x, y), BLUE)
        return merged

    # Example usage, with the two barcode images loaded from disk:
    # merged_510 = merge_barcodes(Image.open("barcode_410_1.png"),
    #                             Image.open("barcode_410_2.png"))









Persons skilled in the relevant arts will recognize that the above code sample is exemplary only, and other code and other color combinations may be employed within the scope and spirit of the present system and method.


The resulting, merged, 2-D barcode 510 encodes, in full, both the original plaintext transaction data 240 and the encrypted transaction data 425. Merged image 510 can be transmitted to either party to the transaction (for example seller or purchaser) as a graphical image. As will be discussed further below, merged image 510, and the transaction data which it contains, can be readily validated at the receiving end to establish whether the received data is valid or if instead the received data has been corrupted in any way.


Paper-free Bill or “e-bill”: In an embodiment of the present system and method, the merged image 510 may constitute a final, complete paper-free bill, which encodes all transaction data 240 in an image. In an alternative embodiment (discussed further below, see FIGS. 6, 7, and 8), the paper-free bill or e-bill may be constituted by the merged image 510 further concatenated with a biometric signature image 712 or other electronic signature.


Transaction Affirmation Via Digitized Biometric Signatures


Conventionally, business transactions such as sales transactions are affirmed, witnessed, or validated by one or both parties to the transaction. For printed documents, a commonly used form of affirmation is the written signature of one or both parties.


In embodiments of the present system, the electronic transaction document(s) created, recorded, and transmitted may be affirmed by biometric signatures 605 which are graphical (that is, image-oriented) in nature. In various embodiments, possible biometric signatures 605 may include, for example and without limitation: fingerprint images, palm-print images, iris scans, retinal scans, and even facial photos.


In an embodiment of the present system and method, the biometric signature 605, such as a fingerprint 605, is in two colors only, or “monochrome”, the two colors often being (as is conventional in the art) black and white, or two other colors with strong relative contrast. In an embodiment, the biometric signatures 605, such as a fingerprint 605 or handprint, or even an iris print, retinal scan, or facial photo, may originally be captured in grayscale or multiple colors. The grayscale or color image may be reduced to a two-color monochrome, such as black and white pixels, via known image processing methods, while retaining image quality and retaining essential image features and image data for purposes of biometric identification of persons.


In an embodiment of the present system and method, a biometric signature of both parties to the transaction is captured via an imaging device or scanner, such as biometric sensor 160 or camera 170 of tablet device 100. The biometric images 605 can be concatenated to the merged 2-D barcode image 510 with the document data, as described further below.


In an embodiment of the present system and method, the graphical biometric signatures 605 may be scrambled or encoded in a variety of ways prior to concatenation and digital transmission. The scrambling or encoding helps ensure that, if the digital transaction document (e-bill) is improperly intercepted or otherwise obtained by third-parties (not parties to the transaction), the biometric signatures 605 cannot be recognized or interpreted. In this way, only legitimate signatory parties to the financial transaction can read and validate the biometric signatures 605.



FIGS. 6 and 7 illustrate successive stages of an exemplary method of scrambling/encoding biometric document signatures 605. While the figures employ fingerprints 605 by way of example, persons skilled in the relevant arts will recognize that the exemplary method illustrated is readily applicable to other biometric signatures such as hand-prints, iris scans, retinal scans, or facial photos.


Fingerprint Shuffling: FIG. 6 illustrates an exemplary biometric signature 605, in this case fingerprint 605. In an embodiment, a fingerprint 605 of a transaction participant is captured, for example via biometric sensor 160 or camera 170 of tablet device 100, with sufficient resolution to capture needed details of the signature. In an embodiment, an image resolution of 512 dots per inch (DPI) may be employed. Other resolutions may be employed as well.


Once captured, fingerprint 605 is spatially divided into subparts 607 via image processing software running on processor 105 or other firmware of tablet 100. In the exemplary embodiment shown, fingerprint 605 is spatially divided into sixteen (16) separate sub-parts 607 of equal width/height dimensions; other numbers of sub-parts 607, greater or lesser, may be employed as well.


As discussed above (see FIG. 3), processor 105 or other firmware of tablet 100 is used to generate an image shuffling sequence 365 based on transaction data 240. FIG. 6 illustrates a specific exemplary shuffling sequence 365.1 (the same as illustrated in FIG. 3 above). The sequence 365.1 shown is arbitrary and purely for illustration, and many other document-specific sequences may be generated by the present system and method based on transaction-specific data 240.


The fingerprint shuffling sequence 365 maps the original image from its original spatial ordering to a new spatial ordering. In an exemplary embodiment, the sub-parts 607 may be numbered sequentially, starting with number ‘1’ in an upper-left corner, and incrementing by one (1) reading from left-to-right and top-row to bottom-row as shown. Each sub-part 607 is then mapped from its original location to the former location of another sub-part 607. The result is a shuffled fingerprint image 615.
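

Presented here, by way of illustration only, is an exemplary Python-language sketch of such tile shuffling; the 4x4 grid, the numpy image representation, and the example permutation are illustrative assumptions only:

    # Rearrange the 16 sub-parts of a square monochrome image according to a
    # shuffling sequence: output position d receives original sub-part
    # sequence[d] (1-based, numbered left-to-right, top-to-bottom).
    import numpy as np

    def shuffle_tiles(img, sequence, grid=4):
        h, w = img.shape[0] // grid, img.shape[1] // grid
        tiles = [img[r*h:(r+1)*h, c*w:(c+1)*w]
                 for r in range(grid) for c in range(grid)]
        out = np.empty_like(img)
        for dst, src in enumerate(sequence):
            r, c = divmod(dst, grid)
            out[r*h:(r+1)*h, c*w:(c+1)*w] = tiles[src - 1]
        return out

    # An arbitrary stand-in for a document-specific sequence 365:
    demo_sequence = [7, 1, 14, 4, 9, 16, 2, 11, 5, 12, 8, 15, 3, 10, 6, 13]
    shuffled_615 = shuffle_tiles(np.zeros((512, 512), dtype=np.uint8), demo_sequence)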


Two Shuffled Fingerprints: With reference now to FIG. 7, and in an embodiment, fingerprint images 605.1 and 605.2 are provided by a respective first party to the financial/sales transaction and a second party to the financial/sales transaction. The signatures 605 serve as affirmations or validations of the transaction(s) by the respective parties.


In an embodiment of the present system and method, the two fingerprint images 605.1, 605.2 are each shuffled into respective shuffled fingerprint images 615.1, 615.2. In an embodiment, both fingerprint images 605.1, 605.2 are shuffled according to a common, shared fingerprint shuffling sequence 365. In an alternative embodiment, the present system and method may generate two different fingerprint shuffling sequences 365.1, 365.2, and then shuffle each of fingerprint images 605.1, 605.2 according to the respective first and second fingerprint shuffling sequences 365.1, 365.2.


Merging/Combining the Two Shuffled Fingerprints into One Image: In an embodiment of the present system and method, and similar to the manner in which two black-and-white 2-D barcodes 410 may be merged into a single, multi-colored 2-D barcode 510 (see FIG. 5 above), shuffled fingerprints 615.1 and 615.2 (which may be black and white, as is conventional in the art) are merged into a single, combined multi-colored shuffled fingerprint image 712, referred to equivalently herein as the "merged fingerprint image" 712.


The merged fingerprint image 712 is generated in such a way that a third-party who is viewing or parsing the merged fingerprint 712 would generally not be able to extract the two original shuffled fingerprints 615.1, 615.2. This is because a third-party would require access to a proprietary algorithm to reconstruct the two original shuffled fingerprints 615.1, 615.2. At the same time, with access to the appropriate algorithm, the two initial shuffled fingerprints 615.1, 615.2 can later be recovered from the merged fingerprint image 712.


In an embodiment, shuffled fingerprint 615.1 and shuffled fingerprint 615.2 are graphically merged by combining each corresponding pixel from the two source shuffled fingerprints 615, and generating a designated color pixel 715 for each different combination of source pixels. "Corresponding pixels" are pixels which share a same coordinate position; that is, both pixels are in row X, column Y of respective shuffled fingerprints 615.1 and 615.2. This merging process is illustrated diagrammatically in FIG. 7 via process arrow 705.


Understood another way, shuffled fingerprint 615.1 may be overlapped directly on top of shuffled fingerprint 615.2 (or vice versa). The overlap of any two source pixels generates a merged pixel combination 715 which may be assigned a specific color.


In an exemplary embodiment illustrated in FIG. 7:

    • a pixel combination 715.1 of two white pixels generates a resulting black pixel;
    • a pixel combination 715.2 of two black pixels generates a resulting white pixel;
    • a pixel combination 715.3 of a white pixel from shuffled fingerprint 615.1 overlapping a black pixel from shuffled fingerprint 615.2 generates a resulting blue pixel; and
    • a combination 715.4 of a black pixel from shuffled fingerprint 615.1 overlapping a white pixel from shuffled fingerprint 615.2 generates a resulting red pixel.


The above combinations, as illustrated in FIG. 7, are exemplary only. Other color combination schemes may be envisioned within the scope and spirit of the present system and method.
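

Presented here, by way of illustration only, is an exemplary Python-language sketch of this pixel-merging scheme, following the FIG. 7 combinations above; the specific RGB values assigned to each output color are illustrative assumptions only:

    # Merge two same-sized 1-bit shuffled fingerprints into one 4-color image.
    from PIL import Image

    MERGE_COLORS = {
        ("WHITE", "WHITE"): (0, 0, 0),        # two white pixels -> black
        ("BLACK", "BLACK"): (255, 255, 255),  # two black pixels -> white
        ("WHITE", "BLACK"): (0, 0, 255),      # white over black -> blue
        ("BLACK", "WHITE"): (255, 0, 0),      # black over white -> red
    }

    def merge_fingerprints(img1, img2):
        out = Image.new("RGB", img1.size)
        p1, p2, po = img1.load(), img2.load(), out.load()
        for y in range(img1.size[1]):
            for x in range(img1.size[0]):
                a = "WHITE" if p1[x, y] else "BLACK"  # mode "1": 0 = black
                b = "WHITE" if p2[x, y] else "BLACK"
                po[x, y] = MERGE_COLORS[(a, b)]
        return out

Because each of the four output colors corresponds to exactly one combination of source pixels, the mapping is reversible by parties in possession of the table.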


In an alternative embodiment, a designated pixel combination could be randomly mapped to any of several colors of merged pixels. For example, a merging 715.3 of a white pixel over a black pixel could be mapped randomly to any of Red, Yellow, or Brown. For another example, a merging 715.4 of a black pixel over a white pixel could be mapped randomly to any of Blue, Orange, or Violet. For another example, two overlapped white pixels or two overlapped black pixels could be mapped to colors other than black or white. For third-parties who do not know the proprietary mapping algorithm, the use of additional possible output colors may further confound any efforts to “unmerge” or separate the merged image 712 into the original two shuffled fingerprints 615.1, 615.2.


Code (which may for example be in C, C++, Java, or a similar programming language, or may be pseudocode which can be implemented in specific form in any number of known programming languages) may be employed to merge shuffled fingerprints 615.1, 615.2 into a merged, multi-colored shuffled fingerprint 712. Such code may be the same or substantially similar to that discussed above in conjunction with FIG. 5, and is not repeated here.


The resulting merged, multi-colored shuffled fingerprint image 712 combines in full both of the original shuffled fingerprints 615.1, 615.2. Merged image 712 can be transmitted to either party to the transaction (for example, seller or purchaser) as a graphical image. As will be discussed further below, merged image 712 can be readily validated at the receiving end to confirm that the fingerprints 605.1, 605.2, which were generally obtained at the point-of-transaction, match expected fingerprints (such as fingerprints stored in a suitable database [such as an employee database, customer database, etc.], or fingerprints obtained live from direct store delivery (DSD) personnel, in real-time, at the place and time of fingerprint validation).


Final Transaction Document Image


In an embodiment of the present system and method, a final transaction document image 805 is formed by combining or concatenating the merged 2-D barcode image 510 (see FIG. 5 above) with the merged shuffled fingerprint image 712 (see FIG. 7 above). FIG. 8 illustrates an exemplary concatenation of a merged 2-D barcode image 510 with a merged shuffled fingerprint image 712, resulting in a final transaction document image 805.


In an embodiment, the final transaction document image 805 is composed by spatially placing the two source images (merged 2-D barcode image 510 and merged shuffled fingerprint image 712) side-by-side, or one spatially above or spatially below the other; or, put another way, by placing the two source images 510, 712 spatially adjacent to each other. In an embodiment, a border or borders 807 of a designated color or shading may be placed around either or both of merged 2-D barcode image 510 and merged shuffled fingerprint image 712. Border 807 may help distinguish the two separate images within the overall final transaction document 805. Border 807 may also help ensure that the final transaction document 805 has a conventional rectangular image shape.
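

Presented here, by way of illustration only, is an exemplary Python-language sketch of such side-by-side composition; the border width and border color are illustrative assumptions only:

    # Compose FTD image 805: bordered barcode image 510 beside bordered
    # fingerprint image 712 on a white background.
    from PIL import Image, ImageOps

    def compose_ftd(barcode_510, fingerprints_712, border=8, fill="gray"):
        left = ImageOps.expand(barcode_510.convert("RGB"), border=border, fill=fill)
        right = ImageOps.expand(fingerprints_712.convert("RGB"), border=border, fill=fill)
        ftd = Image.new("RGB", (left.width + right.width,
                                max(left.height, right.height)), "white")
        ftd.paste(left, (0, 0))
        ftd.paste(right, (left.width, 0))
        return ftd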


In alternative embodiments, other two-dimensional spatial arrangements may be made between merged 2-D barcode image 510 and merged shuffled fingerprint image 712 to compose the overall final transaction document image 805. In an embodiment, the sizes (pixel dimensions) of merged 2-D barcode image 510 and shuffled fingerprint image 712 can be adjusted in final transaction document image 805 for easy transfer and decoding.


In an alternative embodiment (not illustrated), additional image modifications may be made to final transaction document (FTD) image 805. These may entail, for example, applying one or more data-lossless image deformations, such as skewing, rippling, or twirling, to FTD image 805. At a receiving end, and for purposes of data recovery, the present system and method would then entail first applying a suitable reverse deformation (deskew, unripple, untwirl, etc.), with suitable inverse deformation parameters, to the received FTD image 805.


In an embodiment, within final transaction document image 805, merged 2-D barcode image 510 and merged shuffled fingerprint image 712 are arranged so that the two source images 510, 712, when concatenated, can still be readily distinguished from each other, and so that neither source image obscures the other. This ensures that the data contained in each of merged 2-D barcode image 510 and merged shuffled fingerprint image 712 is not obscured by the other image. It further ensures that, on the receiving end, image processing software can readily extract and separate both of merged 2-D barcode image 510 and merged shuffled fingerprint image 712.


In an alternative embodiment, within final transaction document image 805, merged 2-D barcode image 510 and merged shuffled fingerprint image 712 may be arranged so that the two source images 510, 712 partially or wholly overlap, with, for example, still further pixel color mappings for the overlapped pixels. In such alternative embodiments, the receiving end would apply suitable additional algorithms to distinguish the overlapping pixels and separate the two source images 510, 712.


Paper-free Bill or "e-bill": In an embodiment of the present system and method, the final transaction document image 805, which encodes all transaction data 240 in an image and also includes a biometric signature image 712, constitutes a "paper-free bill". This may also be referred to equivalently as an "e-bill"; similar terms, such as "paper-free invoice", "e-invoice", or "e-receipt", may be used for particular transaction documents as applicable.


Image Data Transfer: Local Transfer, and Transfer from Point-of-Transaction to Remote Offices


It will be noted that, at the point-of-sale, where final transaction document (FTD) image 805 is created, the FTD image 805 may be transferred between transaction parties via local wired or wireless communications. For example, if FTD image 805 is created on tablet 100 (for example, by the seller), FTD image 805 may then be transferred to the buyer's tablet or to a cellular phone via a wireless connection such as Bluetooth or a local WiFi network. FTD image 805 may also be transferred from the seller to the buyer's cellular phone or tablet via a wired connection such as USB or Ethernet.


With reference now to FIG. 9, in an embodiment of the present system and method: Final transaction document (FTD) image 805, which is generally created at the point-of-service on tablet 100, can then be transmitted to appropriate business offices or other parties for processing and validation. For example, FTD image 805 may be sent to the business offices of a buyer and seller who engaged in a sales transaction at the point-of-service.


Direct Transmission: In an embodiment, FTD image 805 may be transmitted by a first transmission process 915.1, for example via conventional e-mail, via FTP (file transfer protocol), via cellular network transmission, or similar, from tablet 100 to the receiving parties. FTD image 805 may be sent in any known conventional document format, such as JPG, PNG, TIFF, BMP, GIF, and others well known in the art, provided the image format permits sufficiently clear image resolution and detail for data recovery at the receiving end.


The received FTD image 940 is obtained by the receiving party. The received image 940 is processed via a decode and validation process 950 (discussed further below) to yield the original transaction document values 240 and fingerprints 605.


Indirect Transmission Via Image Capture: In an alternative embodiment of the present system and method, FTD image 805 may first be captured in the field by another imaging device, such as a cell phone 905. For example, one of the parties to the point-of-service transaction may employ a cell phone 905 to capture the image of FTD image 805 as shown directly on a display screen 120 of tablet 100. Because cell phone imaging is subject to manual uncertainties (such as imperfect orientation of the cell phone 905 by the user, or hand motion of the cell phone 905 by the user), the captured image 910 may be subject to imperfections. These imperfections may include skewing of the image, partial rotation of the image, slight blurring, or other imperfections.


Captured FTD image 910 may be transmitted by a second transmission process 915.2 (for example, via the cellular network methods employed by a suitable cell phone service provider) from cell phone 905 to the receiving parties. Captured FTD image 910 may be sent in any known conventional document format, such as JPG, PNG, TIFF, BMP, GIF, and others well known in the art.


The received, captured FTD image 945 is obtained by the receiving party; it will retain any imperfections caused by the image capture process on cell phone 905.


In an embodiment of the present system and method, on the receiving end, the received, captured FTD image 945 may be subject to image-correction processes 947, such as deskewing, rotation, image sharpening, and other appropriate image-processing methods generally known in the art. The result is a corrected received image 940 which is the same, or substantially the same, as the final transaction document image 805.
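

Presented here, by way of illustration only, is an exemplary Python-language sketch of one such correction, using a classic OpenCV deskew recipe; the recipe, and the assumption that the document content defines the dominant angle, are illustrative only (note also that the angle convention of cv2.minAreaRect varies across OpenCV versions):

    # Estimate and remove small rotation/skew introduced by hand-held capture.
    import cv2
    import numpy as np

    def deskew(img):
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        # Treat non-white pixels as content; find their minimum-area bounding box.
        _, mask = cv2.threshold(gray, 0, 255,
                                cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
        coords = np.column_stack(np.where(mask > 0)).astype(np.float32)
        angle = cv2.minAreaRect(coords)[-1]
        angle = -(90 + angle) if angle < -45 else -angle
        h, w = img.shape[:2]
        m = cv2.getRotationMatrix2D((w / 2, h / 2), angle, 1.0)
        return cv2.warpAffine(img, m, (w, h), flags=cv2.INTER_CUBIC,
                              borderMode=cv2.BORDER_REPLICATE)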


The corrected received image 940 is further processed via a decode and validation process 950 (discussed further below) to yield the original transaction document values 240 and fingerprints 605.


Method for Transaction Document Encoding into Verifiable Image Format



FIG. 10 presents a flow-chart of an exemplary method 1000 for an electronic device, such as a tablet computer 100, to encode commercial transaction data 240 and visual biometric transaction signatures 605 (such as fingerprints), obtained at a point-of-service sales transaction, into a secure, self-verifiable final transaction document image 510, 805.


The exemplary method 1000 may entail some method steps which are the same as, or similar to, method steps previously discussed above in this document (see FIGS. 2 through 9 and associated discussions). Therefore, some exemplary details which are discussed at length above in this document are not repeated here. It will be understood by persons skilled in the relevant arts that methods, steps, and processes discussed throughout this document (and associated computer code which may be used in actual implementation), including specifically those of method 1000 as well as other methods above, may be combined or integrated in suitable combinations to achieve the outputs, outcomes, and purposes described herein.


The exemplary method 1000 may be performed via an electronic processing device, such as tablet 100, which is used to obtain, store, and process commercial transaction data, such as a sale and payment between two parties. The method 1000 may be performed after transaction data has been entered into, and stored on, tablet 100. The method 1000 may also be performed by a secondary, associated electronic processing device (which also has a processor, memory, and other elements similar to those of tablet 100), and which is communicatively coupled to tablet 100 to obtain data from tablet 100. For convenience of exposition only, the discussion below assumes method 1000 is performed via tablet 100.


Persons skilled in the relevant arts will recognize that the order of steps in method 1000 is exemplary only, and some steps shown later in method 1000 may be performed before steps which are described herein as earlier in the method 1000.


The method 1000 begins with step 1005. In step 1005, method 1000 extracts unique transaction data 240 and other distinctive transaction data 240 from a transaction document 200. (See FIGS. 2 and 3 above for related discussion.) In an embodiment of the present system and method, step 1005 also entails extracting the remaining document data which is neither unique nor distinctive, but which may be encoded as well in 2-D barcode format. (It will be understood that references herein to "transaction data 240" may include such remaining document data which is neither unique nor distinctive.) The transaction data 240, in its original alphanumeric form, and not encrypted, is referred to also as plaintext data.


In step 1010, method 1000 generates a first two-dimensional (2-D) bar code 410.1, such as a matrix code or QR code. The encoding process 405 writes the plaintext transaction data 240 into a graphical form 410.1, typically a matrix or other arrangement of black-and-white cells or bars. (See FIG. 4A above for related discussion.)
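

Presented here, by way of illustration only, is an exemplary Python-language sketch of this encoding step; the open-source "qrcode" package and the sample data string are illustrative assumptions only, as the present system and method does not mandate any particular barcode library:

    # Encode plaintext transaction data 240 into a black-and-white QR code 410.1.
    import qrcode

    plaintext_240 = "INV-20180501-0042|BUYER:ACME|TOTAL:118.50"  # illustrative data
    barcode_410_1 = qrcode.make(plaintext_240)
    barcode_410_1.save("barcode_410_1.png")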


In step 1015, method 1000 generates a document-specific encryption password 360 which is based on, and is generally unique to, the unique transaction data 240 from document 200. The document-specific encryption password 360 (also referred to herein simply as "password 360") may be generated according to any number of encryption methods 310 known in the art, or according to a proprietary encryption algorithm 310, as long as encryption algorithm 310 generates the password 360 based upon and unique to the unique transaction data 240. (See FIG. 3 above for related discussion.)
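

Presented here, by way of illustration only, is an exemplary Python-language sketch of one such derivation; the use of SHA-256 and the truncation length are illustrative assumptions standing in for whichever method 310 an implementation actually employs:

    # Derive a document-specific password 360 from unique transaction data 240.
    import hashlib

    def derive_password(unique_fields):
        digest = hashlib.sha256("|".join(unique_fields).encode("utf-8")).hexdigest()
        return digest[:16]  # same unique data always yields the same password

    password_360 = derive_password(["INV-20180501-0042", "2018-05-01", "118.50"])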


In step 1020, method 1000 generates encrypted transaction data 425. Encrypted transaction data 425 is generated via a specified data encryption algorithm 420, and encrypts transaction data 240 by employing the document-specific password 360 discussed above (possibly along with additional passwords). The output of the step is encrypted, or “ciphertext”, transaction data 425. (See FIG. 4B above for related discussion.)
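

Presented here, by way of illustration only, is an exemplary Python-language sketch of this encryption step; Fernet and PBKDF2 from the open-source "cryptography" package are illustrative stand-ins for the unspecified encryption algorithm 420, and the fixed salt (which keeps the derived key reproducible from the same password) is an assumption for demonstration only:

    # Encrypt plaintext transaction data 240 under document-specific password 360.
    import base64
    from cryptography.fernet import Fernet
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

    def encrypt_transaction_data(plaintext_240, password_360):
        kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                         salt=b"ebill-demo-salt", iterations=100_000)
        key = base64.urlsafe_b64encode(kdf.derive(password_360.encode("utf-8")))
        return Fernet(key).encrypt(plaintext_240.encode("utf-8"))  # ciphertext 425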


In step 1025, method 1000 generates a second two-dimensional (2-D) bar code 410.2, such as a matrix code, QR code, or other arrangement of black-and-white cells or bars. This encodes 405 the ciphertext transaction data 425 into a graphical form 410.2. (See FIG. 4B above for related discussion.)


In step 1030, method 1000 generates a merged 2-D barcode 510 which integrates the data from first barcode 410.1 and second barcode 410.2. Merged barcode 510 is created in such a way as to encode all the data from first barcode 410.1, which is the plaintext transaction data 240; and also all the data from second barcode 410.2, which is the ciphertext transaction data 425. (See FIG. 5 above for related discussion.)


In an embodiment, merged barcode 510 is generated by overlapping corresponding cells of first barcode 410.1 and second barcode 410.2, and mapping the various overlapped cell combinations to designated colors. Therefore, in an embodiment, merged barcode 510 is a multicolored barcode. A color mapping function, table, or algorithm may be implemented to perform such a mapping, mapping a first cell overlapping a second cell to a specific color. An exemplary code sample may be, for example and without limitation:


    OriginalCell_1_Color(m, n) = a;
    OriginalCell_2_Color(m, n) = b;
    OriginalCell_1_Color(m, n) + OriginalCell_2_Color(m, n) → FinalCellColor(a, b),


where ‘m’ and ‘n’ are row and column values; while ‘a’ and ‘b’ may take values of ‘0’ or ‘1’ for "Black" and "White" respectively. The plus (+) operator indicates overlapping the first cell over the second cell; and a table FinalCellColor(a, b) may be defined with values such as Green, Yellow, Red, Blue depending on the specific values (0 or 1) of a and b. For example, FinalCellColor(0, 0) may be defined as "Yellow", and so maps a black-on-black cell combination to the color yellow for a single combined cell.


Other similar mappings, including mappings which allow for inclusion of alternative or additional colors, may be envisioned as well. (See again FIG. 5 above for related discussion.)
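

By way of illustration only, the FinalCellColor table above may be written as a simple lookup structure, here sketched in Python; the assignment of colors to the three combinations not fixed in the text is an arbitrary assumption:

    # FinalCellColor lookup: 0 = Black, 1 = White, per the convention above.
    FinalCellColor = {
        (0, 0): "Yellow",  # black over black (as given in the text)
        (0, 1): "Green",   # black over white (assumed)
        (1, 0): "Red",     # white over black (assumed)
        (1, 1): "Blue",    # white over white (assumed)
    }

    def merged_cell_color(a, b):
        return FinalCellColor[(a, b)]

Any four mutually distinct colors suffice, since each merged color must identify exactly one combination of source cells.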


In an embodiment of the present system and method, method 1000 stops after completion of step 1030, the method having generated the multi-colored merged barcode 510, which may also serve as the Final Transaction Document (FTD) image 805.


In an alternative embodiment, transaction document 200 includes, or has associated with it, graphical biometric signatures 605 which represent signatures of the human parties to the transaction. Graphical biometric signatures may be fingerprints, hand prints, iris scans, retinal scans, facial images, or similar. In such embodiments, method 1000 may continue (after step 1030) with step 1035.


In step 1035, exemplary method 1000 generates a unique, document-specific image shuffling sequence 365 based on the unique transaction data 240. Image shuffling sequence 365 maps a set of index numbers, typically starting at 1 and incrementing by 1, back onto itself, but in a different order. The order mapping indicates how sections 607 of a biometric image 605 may be spatially re-ordered to create shuffled biometric image 615. The numbers may be shuffled or re-ordered according to any of a variety of shuffling methods 350, as long as shuffling algorithm 350 generates a shuffling sequence 365 which is both based upon and unique to the unique transaction data 240. (See FIGS. 3 and 6 above for related discussion.)
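

Presented here, by way of illustration only, is an exemplary Python-language sketch of one such derivation, seeding a pseudo-random generator from a hash of the unique transaction data; the hash and generator chosen are illustrative stand-ins for the unspecified shuffling method 350:

    # Derive a document-specific shuffling sequence 365 from transaction data 240.
    import hashlib
    import random

    def shuffling_sequence(unique_data_240, n_parts=16):
        seed = int.from_bytes(
            hashlib.sha256(unique_data_240.encode("utf-8")).digest()[:8], "big")
        rng = random.Random(seed)       # same document data -> same sequence
        seq = list(range(1, n_parts + 1))
        rng.shuffle(seq)
        return seq                      # a permutation of 1..16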


In step 1040, method 1000 obtains the graphical biometric signatures 605 of the transaction parties, for example the fingerprints 605 of a buyer and a seller. The graphical biometric signatures 605 are typically images in such formats as GIF, JPG, BMP, and other graphical formats known in the art. (See FIG. 6 and other figures above for related discussions.)


In step 1045, method 1000 divides each of the biometric images 605 (such as fingerprints 605) into multiple image segments 607, typically square or rectangular in shape. The image segments 607 are mutually non-overlapping, but together the image segments 607 may be spatially arranged, in their original order of spatial relations and original orientations, to reconstruct each biometric image 605. In an embodiment, the segments 607 are labeled/numbered with sequential numbers, in order to be shuffled according to image shuffling sequence 365. (See FIG. 6 above for related discussion.)


In step 1050, method 1000 spatially rearranges each of the first biometric image 605.1 and the second biometric image 605.2, to create respective, shuffled biometric images 615.1, 615.2. The shuffling is done according to the order indicated by the mapping in the document-specific image shuffling sequence 365. (See again FIG. 6 above for related discussion.)


In step 1055, method 1000 generates a merged shuffled biometric signature 712. Merged biometric signature 712 is created in such a way as to encode substantially all the image data from first shuffled biometric image 615.1 and from second shuffled biometric image 615.2. (See FIG. 7 above for related discussion.)


In an embodiment, merged shuffled biometric signature 712 is generated by overlapping corresponding image pixels of first shuffled biometric image 615.1 and second shuffled biometric image 615.2, and mapping various overlapped pixel combinations to designated colors. Therefore, in an embodiment, merged shuffled biometric signature 712 is a multicolored image. As with merged barcode 510 discussed above, a color mapping function, table, or algorithm may be implemented to perform such a mapping, mapping a first pixel overlapping a second pixel to a specific final color pixel. (See again FIG. 7 above for related discussion.)


In step 1060, method 1000 forms a combined document image 805 which may be referred to as the Final Transaction Document (FTD) image 805, and which spatially concatenates merged 2-D barcode image 510 with merged shuffled biometric image 712. In an embodiment, the two images are concatenated by arranging them spatially side-by-side in one image. (See FIG. 8 above for related discussion.)


Received Data: Data Retrieval and Validation


In an embodiment of the present system and method, the FTD image 805 is transmitted to receiving parties, which may for example be business offices associated with the point-of-transaction seller and/or buyer. (See FIG. 9 above for related discussion.)


The transaction data encoded in the FTD image 805 can be extracted at the receiving end. In addition, data validation may be performed if there are any disputes between sender and receiver.



FIG. 11 illustrates an exemplary method 1100 for data recovery and validation by a receiving party. Method 1100 may be performed on the receiving end by any processing device, such as a tablet 100 or other computing device equipped with a suitable processor, memory, and computer code configured to implement the steps of exemplary method 1100. Method 1100 may also be performed on the sending end of the transaction (for example, for pre-validation before sending).


In step 1105 the method 1100 extracts from FTD image 805 both of:

    • the merged 4-color 2-D barcode 510; and
    • the merged shuffled fingerprint image 712.


In an embodiment, extraction of the two images 510, 712 involves a straightforward spatial parsing of the two images, as the two images are placed spatially side-by-side and are non-overlapping in FTD image 805.


In step 1110, method 1100 extracts, from merged 2-D barcode 510, each of the first 2-D barcode 410.1 with the plaintext data 240, and also the second 2-D barcode 410.2 with the ciphertext data 425. In an embodiment, extracting the two barcodes 410 is performed by reversing the method 500 described above in association with FIG. 5.


Consider for example any cell of merged 2-D barcode 510 at a cell coordinate (X, Y). That cell will have a specific color. The corresponding cells at the (X, Y) coordinates of first 2-D barcode 410.1 and second 2-D barcode 410.2 are restored by determining the overlapped cell combination 515 which resulted in the output color.


Presented here is exemplary code (which may for example be in C, C++, Java, or a similar programming language, or may be pseudocode which can be implemented in specific form in any number of known programming languages) of a kind which may be employed to distinguish plaintext 2-D barcode 410.1 and ciphertext 2-D barcode 410.2 from a merged 2-D barcode image 510:


    {
        // Merged cell color is WHITE
        if Merged_QRCode_Image(x,y).PixelColor == "WHITE" Then
            Actual_Invoice_QRCode(x,y).PixelColor = "WHITE" AND
            Encrypted_Invoice_QRCode(x,y).PixelColor = "WHITE";

        // Merged cell color is BLACK
        if Merged_QRCode_Image(x,y).PixelColor == "BLACK" Then
            Actual_Invoice_QRCode(x,y).PixelColor = "BLACK" AND
            Encrypted_Invoice_QRCode(x,y).PixelColor = "BLACK";

        // Merged cell color is RED
        if Merged_QRCode_Image(x,y).PixelColor == "RED" Then
            Actual_Invoice_QRCode(x,y).PixelColor = "WHITE" AND
            Encrypted_Invoice_QRCode(x,y).PixelColor = "BLACK";

        // Merged cell color is BLUE
        if Merged_QRCode_Image(x,y).PixelColor == "BLUE" Then
            Actual_Invoice_QRCode(x,y).PixelColor = "BLACK" AND
            Encrypted_Invoice_QRCode(x,y).PixelColor = "WHITE";
    }


Persons skilled in the relevant arts will recognize that the above code sample is exemplary only, and other code and other color combinations may be employed within the scope and spirit of the present system and method.


In step 1115, method 1100 extracts a first set of plaintext unique transaction document data (labeled herein as 240.1, though not specifically shown in FIG. 11) from the recovered first 2-D barcode 410.1. This first set of plaintext transaction data 240.1 is recovered by extracting data from first 2-D barcode 410.1 according to a suitable barcode data extraction method known in the art for the specific 2-D barcode technology employed.


In step 1120, method 1100 generates an encryption password 360. The encryption password 360 is generated from the first set of unique plaintext transaction data 240.1, and is the same encryption password 360 as generated originally to create encrypted data 425 at the point-of-sale. (See FIG. 4B above for related discussion.)


In step 1125, method 1100 extracts the ciphertext transaction document data 425 from the recovered second 2-D barcode 410.2. Ciphertext data 425 is recovered by extracting data from second 2-D barcode 410.2 according to a suitable barcode data extraction method known in the art for the specific 2-D barcode technology employed.


In step 1130, method 1100 decrypts ciphertext data 425 by applying the encryption password 360 (which was generated in step 1120) to ciphertext data 425 according to a suitable decryption algorithm; the suitable decryption algorithm is one designed to decrypt ciphertext generated in step 1020 of method 1000 above via encryption algorithm 420. (See also FIG. 4B above for related discussion.) The output of the decryption process is a second set of plaintext unique transaction document data (labeled herein as 240.2, though not specifically shown in FIG. 11).


Summarizing to this point: a first set of plaintext unique transaction document data 240.1 has been extracted from first 2-D barcode 410.1, while a second set of plaintext unique transaction document data 240.2 has been recovered from second 2-D barcode 410.2.


In step 1135, method 1100 compares first set of plaintext transaction document data 240.1 with second set of plaintext transaction document data 240.2. If document and data integrity has been maintained, the two sets of plaintext data 240.1, 240.2 should be the same.


If the comparison of step 1135 determines that the first set of plaintext transaction document data 240.1 and the second set of plaintext transaction document data 240.2 are not the same, then in step 1140.2 method 1100 determines that the retrieved data is invalid or corrupted, or that in some other way data integrity has not been maintained. The method may stop at this point. In an alternative embodiment (not illustrated in the flow chart of FIG. 11), and in spite of the loss of data integrity, the method 1100 may continue with step 1145 and the steps subsequent to it.


If the comparison of step 1135 determines that first set of plaintext transaction document data 240.1 and second set of plaintext transaction document data 240.2 are the same, then in step 1140.1 method 1100 determines that the retrieved data is valid, that is, that data integrity has been maintained.


In an alternative embodiment (not shown in FIG. 11), the present system and method does not compare plaintext transaction document data 240.1 with plaintext transaction document data 240.2. Instead, the method 1100 may first encrypt the first set of plaintext transaction document data 240.1 by applying the encryption password 360 generated in step 1120. The method 1100 then compares this newly encrypted transaction document data with the ciphertext transaction document data 425 extracted from recovered second 2-D barcode 410.2. If the two sets of encrypted data 425 are equal, data integrity has been maintained. If the two sets of encrypted data 425 are not equal, data integrity may be determined to have been lost.


In an embodiment of the present system and method, method 1100 may stop at this point. In an alternative embodiment, method 1100 continues with step 1145.


In step 1145, method 1100 generates the unique image shuffling sequence 365 from the first set of plaintext unique transaction document data 240.1. The unique image shuffling sequence 365 is the same as that generated originally to create the shuffled biometric images 615 at the point-of-sale. (See FIGS. 3 and 6 above for related discussion. See also step 1035 of method 1000, discussed above in conjunction with FIG. 10.)


In step 1150, method 1100 generates an inverse image shuffling sequence, which is the inverse of image shuffling sequence 365. In an embodiment, the inverse image shuffling sequence is simply a reverse of the mapping of image shuffling sequence 365, and can be used to restore a shuffled biometric image 615 to the corresponding original biometric image 605 (such as an original, unshuffled fingerprint image 605).
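

Presented here, by way of illustration only, is an exemplary Python-language sketch of such an inversion, using the 1-based numbering convention of the text:

    # Invert a shuffling sequence: if shuffled position d holds original
    # sub-part sequence[d], then applying the inverse sequence with the same
    # shuffling routine returns every sub-part to its original position.
    def inverse_sequence(sequence):
        inverse = [0] * len(sequence)
        for dst, src in enumerate(sequence, start=1):
            inverse[src - 1] = dst
        return inverse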


In step 1155, method 1100 extracts, from merged shuffled biometric image signature 712, the first shuffled biometric image 615.1 and the second shuffled biometric image 615.2. These images may for example be the shuffled images of the transaction-participant fingerprints 605 originally obtained at the point of service.


In an embodiment, extracting the two shuffled biometric images 615 is performed by reversing the method described above in association with FIG. 7.


Consider for example any pixel of merged shuffled biometric image 712 at a pixel coordinate (X, Y). That pixel will have a specific color. The corresponding pixels at the (X, Y) coordinates of first shuffled biometric image 615.1 and second shuffled biometric image 615.2 are restored by determining the overlapped pixel combination 715 which resulted in the output (merged) pixel color.


Presented here is exemplary code (which may for example be in C, C++, Java, or a similar programming language, or may be pseudocode which can be implemented in specific form in any number of known programming languages) of a kind which may be employed to extract first shuffled biometric image 615.1 and second shuffled biometric image 615.2 from a merged shuffled biometric image 712:


    {
        // Merged pixel color is WHITE
        if Merged_Biometric_Image(x,y).PixelColor == "WHITE" Then
            {First_Biometric_Image(x,y).PixelColor = "BLACK" AND
             Second_Biometric_Image(x,y).PixelColor = "BLACK"};

        // Merged pixel color is BLACK
        if Merged_Biometric_Image(x,y).PixelColor == "BLACK" Then
            {First_Biometric_Image(x,y).PixelColor = "WHITE" AND
             Second_Biometric_Image(x,y).PixelColor = "WHITE"};

        // Merged pixel color is BLUE
        if Merged_Biometric_Image(x,y).PixelColor == "BLUE" Then
            {First_Biometric_Image(x,y).PixelColor = "WHITE" AND
             Second_Biometric_Image(x,y).PixelColor = "BLACK"};

        // Merged pixel color is RED
        if Merged_Biometric_Image(x,y).PixelColor == "RED" Then
            {First_Biometric_Image(x,y).PixelColor = "BLACK" AND
             Second_Biometric_Image(x,y).PixelColor = "WHITE"};
    }


Persons skilled in the relevant arts will recognize that the above code sample is exemplary only, and other code and other pixel color combinations may be employed within the scope and spirit of the present system and method.


In step 1160, method 1100 reverse-shuffles each of the first shuffled biometric image 615.1 and the second shuffled biometric image 615.2 (from step 1155), both according to the inverse image shuffling sequence generated in step 1150. The result is a respective first unshuffled biometric image signature 605.1 (for example, an image recognizable as a first fingerprint) and a second unshuffled biometric image signature 605.2 (for example, an image recognizable as a second fingerprint).


In step 1165, method 1100 obtains, from a suitable employee or personnel database, a stored biometric image (such as a fingerprint) which is expected for one or both parties to the sales transaction. In an alternative embodiment, step 1165 may obtain one or both validation fingerprint images of expected human parties to the transaction via live, real-time scanning of their fingerprints with a local or remote fingerprint scanner. The expected fingerprints, of the persons who should be the signatories to the transaction, may be referred to herein as the “validation biometric signatures” or equivalently as the “stored/live biometric signatures”. (“Image” may also be used in place of “signature”.)


In step 1170, the method 1100 may compare the first unshuffled biometric image 605.1 with a stored/live biometric image for a first party to the sales transaction. If the two images compare favorably (that is, are determined to be biometric images of the same person, such as fingerprints of the same person), then in step 1175.1 the first signature on the electronic document is considered verified as valid. If the two images do not compare favorably (that is, are determined not to be, or possibly not to be, biometric images of the same person), then in step 1175.2 the first signature on the electronic document is considered not verified, or invalid.


Similarly, and also in step 1170, the method 1100 may compare the second unshuffled biometric image 605.2 with a stored/live biometric image for a second party to the sales transaction. If the two images compare favorably, then in step 1175.1 the second signature on the electronic document is considered verified as valid. If the two images do not compare favorably, then in step 1175.2 the second signature on the electronic document is considered not verified, or invalid.


Note: In an embodiment, for fingerprint comparisons and validation, the present system and method may employ proprietary code. In an alternative embodiment, the present system and method may also employ open source code for fingerprint comparisons. See for example:


https://sourceforge.net/projects/sourceafis/


Additional Considerations and Embodiments


The present system and method supports the collection, recording, encoding and encryption, transmission, and validation of commercial transaction data (such as bills of sale and sales receipts) at a point-of-sale and at other locations. These tasks are accomplished entirely electronically/digitally, via an electronic processing device such as a tablet 100, without the use of paper forms. At the same time, the present system and method retains many of the same benefits as paper forms, along with additional advantages.


1. Paper-Free Bills Cannot be Tampered with Easily.


The present system and method identifies whether any of the files have been tampered with or modified, by extracting the actual (plaintext) data 2-D barcode 410.1 and the encrypted (ciphertext) data 2-D barcode 410.2. If the encrypted data 2-D barcode 410.2 cannot be decrypted properly, or does not match the actual transaction data 240, then the present system and method may determine that the data files have been tampered with, modified, or corrupted.


2. Forged Signatures are Easily Detectable.


Hand-written signatures may be easily forged, but fingerprints 605 and other graphical biometric signatures 605 cannot be readily forged. Further, by virtue of the fingerprint shuffling sequence 365 and the merging of the fingerprint images into a combined shuffled fingerprint image 712, the original fingerprint images can be reconstructed only by parties in possession of the proprietary shuffling and merging algorithms employed.


Other Features


In various embodiments, the present system and method offers these features:

    • The electronic bill (e-bill) can be captured and transmitted by either transaction party by taking a picture of the e-bill via a mobile device or camera, such as a cell phone 905.
    • A merged, multi-color fingerprint image 712 is employed for e-bill file authentication. In an embodiment, the method employs a color-coding scheme to merge two separate shuffled fingerprints 615 into the single image 712.
    • An e-bill-specific encryption password 360 and an e-bill-specific fingerprint shuffling sequence 365 are uniquely generated from critical, transaction-specific data.
    • The actual data 2-D barcode 410.1 and the encrypted data 2-D barcode 410.2 are overlapped via a color-coding scheme to create a multi-colored 2-D barcode 510 for storing invoice data 240.


Additional Embodiments


The present system and method may be implemented in any workflow application as a licensed feature, thereby providing integrated workflow functionality. The present system and method may also be offered as a complete functionality, as a software service, or in a software development kit (SDK), to incorporate the present system and method into business-to-business (B2B) customer applications.


In various embodiments, the present system and method may be employed for Direct Store Delivery (DSD) applications and also for Less-than-Truckload (LTL) applications to completely avoid paper bills.


In various embodiments, the present system and method may include separate software modules or systems to create and/or read paper free bills.


In embodiments of the present system and method, potential benefits and cost-savings compared to paper billing may include:

    • Reduced physical effort for carrying and handling paper, scanners, and printers;
    • The effort of scanning, and of keeping printers charged, is avoided;
    • The costs of paper, scanners, and printers are avoided;
    • The costs of maintaining safe storage facilities for hardcopy bills, which must often be stored for many years, are avoided;
    • The time required to search for particular bills in paper storage is saved;
    • The costs of digital signature licenses and maintenance may be avoided;
    • The time spent scanning, printing, and loading paper into printers is eliminated;
    • Wired or wireless connections are not mandatory to transfer the e-bill; taking a photograph of the e-bill is enough to share the file.


* * *

To supplement the present disclosure, this application incorporates entirely by reference the following commonly assigned patents, patent application publications, and patent applications:

  • U.S. Pat. No. 6,832,725; U.S. Pat. No. 7,128,266;
  • U.S. Pat. No. 7,159,783; U.S. Pat. No. 7,413,127;
  • U.S. Pat. No. 7,726,575; U.S. Pat. No. 8,294,969;
  • U.S. Pat. No. 8,317,105; U.S. Pat. No. 8,322,622;
  • U.S. Pat. No. 8,366,005; U.S. Pat. No. 8,371,507;
  • U.S. Pat. No. 8,376,233; U.S. Pat. No. 8,381,979;
  • U.S. Pat. No. 8,390,909; U.S. Pat. No. 8,408,464;
  • U.S. Pat. No. 8,408,468; U.S. Pat. No. 8,408,469;
  • U.S. Pat. No. 8,424,768; U.S. Pat. No. 8,448,863;
  • U.S. Pat. No. 8,457,013; U.S. Pat. No. 8,459,557;
  • U.S. Pat. No. 8,469,272; U.S. Pat. No. 8,474,712;
  • U.S. Pat. No. 8,479,992; U.S. Pat. No. 8,490,877;
  • U.S. Pat. No. 8,517,271; U.S. Pat. No. 8,523,076;
  • U.S. Pat. No. 8,528,818; U.S. Pat. No. 8,544,737;
  • U.S. Pat. No. 8,548,242; U.S. Pat. No. 8,548,420;
  • U.S. Pat. No. 8,550,335; U.S. Pat. No. 8,550,354;
  • U.S. Pat. No. 8,550,357; U.S. Pat. No. 8,556,174;
  • U.S. Pat. No. 8,556,176; U.S. Pat. No. 8,556,177;
  • U.S. Pat. No. 8,559,767; U.S. Pat. No. 8,599,957;
  • U.S. Pat. No. 8,561,895; U.S. Pat. No. 8,561,903;
  • U.S. Pat. No. 8,561,905; U.S. Pat. No. 8,565,107;
  • U.S. Pat. No. 8,571,307; U.S. Pat. No. 8,579,200;
  • U.S. Pat. No. 8,583,924; U.S. Pat. No. 8,584,945;
  • U.S. Pat. No. 8,587,595; U.S. Pat. No. 8,587,697;
  • U.S. Pat. No. 8,588,869; U.S. Pat. No. 8,590,789;
  • U.S. Pat. No. 8,596,539; U.S. Pat. No. 8,596,542;
  • U.S. Pat. No. 8,596,543; U.S. Pat. No. 8,599,271;
  • U.S. Pat. No. 8,600,158;
  • U.S. Pat. No. 8,600,167; U.S. Pat. No. 8,602,309;
  • U.S. Pat. No. 8,608,053; U.S. Pat. No. 8,608,071;
  • U.S. Pat. No. 8,611,309; U.S. Pat. No. 8,615,487;
  • U.S. Pat. No. 8,616,454; U.S. Pat. No. 8,621,123;
  • U.S. Pat. No. 8,622,303; U.S. Pat. No. 8,628,013;
  • U.S. Pat. No. 8,628,015; U.S. Pat. No. 8,628,016;
  • U.S. Pat. No. 8,629,926; U.S. Pat. No. 8,630,491;
  • U.S. Pat. No. 8,635,309; U.S. Pat. No. 8,636,200;
  • U.S. Pat. No. 8,636,212; U.S. Pat. No. 8,636,215;
  • U.S. Pat. No. 8,636,224; U.S. Pat. No. 8,638,806;
  • U.S. Pat. No. 8,640,958; U.S. Pat. No. 8,640,960;
  • U.S. Pat. No. 8,643,717; U.S. Pat. No. 8,646,692;
  • U.S. Pat. No. 8,646,694; U.S. Pat. No. 8,657,200;
  • U.S. Pat. No. 8,659,397; U.S. Pat. No. 8,668,149;
  • U.S. Pat. No. 8,678,285; U.S. Pat. No. 8,678,286;
  • U.S. Pat. No. 8,682,077; U.S. Pat. No. 8,687,282;
  • U.S. Pat. No. 8,692,927; U.S. Pat. No. 8,695,880;
  • U.S. Pat. No. 8,698,949; U.S. Pat. No. 8,717,494;
  • U.S. Pat. No. 8,720,783;
  • U.S. Pat. No. 8,723,804; U.S. Pat. No. 8,723,904;
  • U.S. Pat. No. 8,727,223; U.S. Pat. No. D702,237;
  • U.S. Pat. No. 8,740,082; U.S. Pat. No. 8,740,085;
  • U.S. Pat. No. 8,746,563; U.S. Pat. No. 8,750,445;
  • U.S. Pat. No. 8,752,766; U.S. Pat. No. 8,756,059;
  • U.S. Pat. No. 8,757,495; U.S. Pat. No. 8,760,563;
  • U.S. Pat. No. 8,763,909; U.S. Pat. No. 8,777,108;
  • U.S. Pat. No. 8,777,109; U.S. Pat. No. 8,779,898;
  • U.S. Pat. No. 8,781,520; U.S. Pat. No. 8,783,573;
  • U.S. Pat. No. 8,789,757; U.S. Pat. No. 8,789,758;
  • U.S. Pat. No. 8,789,759; U.S. Pat. No. 8,794,520;
  • U.S. Pat. No. 8,794,522; U.S. Pat. No. 8,794,525;
  • U.S. Pat. No. 8,794,526; U.S. Pat. No. 8,798,367;
  • U.S. Pat. No. 8,807,431; U.S. Pat. No. 8,807,432;
  • U.S. Pat. No. 8,820,630; U.S. Pat. No. 8,822,848;
  • U.S. Pat. No. 8,824,692; U.S. Pat. No. 8,824,696;
  • U.S. Pat. No. 8,842,849; U.S. Pat. No. 8,844,822;
  • U.S. Pat. No. 8,844,823; U.S. Pat. No. 8,849,019;
  • U.S. Pat. No. 8,851,383; U.S. Pat. No. 8,854,633;
  • U.S. Pat. No. 8,866,963; U.S. Pat. No. 8,868,421;
  • U.S. Pat. No. 8,868,519; U.S. Pat. No. 8,868,802;
  • U.S. Pat. No. 8,868,803; U.S. Pat. No. 8,870,074;
  • U.S. Pat. No. 8,879,639; U.S. Pat. No. 8,880,426;
  • U.S. Pat. No. 8,881,983; U.S. Pat. No. 8,881,987;
  • U.S. Pat. No. 8,903,172; U.S. Pat. No. 8,908,995;
  • U.S. Pat. No. 8,910,870; U.S. Pat. No. 8,910,875;
  • U.S. Pat. No. 8,914,290; U.S. Pat. No. 8,914,788;
  • U.S. Pat. No. 8,915,439; U.S. Pat. No. 8,915,444;
  • U.S. Pat. No. 8,916,789; U.S. Pat. No. 8,918,250;
  • U.S. Pat. No. 8,918,564; U.S. Pat. No. 8,925,818;
  • U.S. Pat. No. 8,939,374; U.S. Pat. No. 8,942,480;
  • U.S. Pat. No. 8,944,313; U.S. Pat. No. 8,944,327;
  • U.S. Pat. No. 8,944,332; U.S. Pat. No. 8,950,678;
  • U.S. Pat. No. 8,967,468; U.S. Pat. No. 8,971,346;
  • U.S. Pat. No. 8,976,030; U.S. Pat. No. 8,976,368;
  • U.S. Pat. No. 8,978,981; U.S. Pat. No. 8,978,983;
  • U.S. Pat. No. 8,978,984; U.S. Pat. No. 8,985,456;
  • U.S. Pat. No. 8,985,457; U.S. Pat. No. 8,985,459;
  • U.S. Pat. No. 8,985,461; U.S. Pat. No. 8,988,578;
  • U.S. Pat. No. 8,988,590; U.S. Pat. No. 8,991,704;
  • U.S. Pat. No. 8,996,194; U.S. Pat. No. 8,996,384;
  • U.S. Pat. No. 9,002,641; U.S. Pat. No. 9,007,368;
  • U.S. Pat. No. 9,010,641; U.S. Pat. No. 9,015,513;
  • U.S. Pat. No. 9,016,576; U.S. Pat. No. 9,022,288;
  • U.S. Pat. No. 9,030,964; U.S. Pat. No. 9,033,240;
  • U.S. Pat. No. 9,033,242; U.S. Pat. No. 9,036,054;
  • U.S. Pat. No. 9,037,344; U.S. Pat. No. 9,038,911;
  • U.S. Pat. No. 9,038,915; U.S. Pat. No. 9,047,098;
  • U.S. Pat. No. 9,047,359; U.S. Pat. No. 9,047,420;
  • U.S. Pat. No. 9,047,525; U.S. Pat. No. 9,047,531;
  • U.S. Pat. No. 9,053,055; U.S. Pat. No. 9,053,378;
  • U.S. Pat. No. 9,053,380; U.S. Pat. No. 9,058,526;
  • U.S. Pat. No. 9,064,165; U.S. Pat. No. 9,064,167;
  • U.S. Pat. No. 9,064,168; U.S. Pat. No. 9,064,254;
  • U.S. Pat. No. 9,066,032; U.S. Pat. No. 9,070,032;
  • U.S. Design Patent No. D716,285;
  • U.S. Design Patent No. D723,560;
  • U.S. Design Patent No. D730,357;
  • U.S. Design Patent No. D730,901;
  • U.S. Design Patent No. D730,902;
  • U.S. Design Patent No. D733,112;
  • U.S. Design Patent No. D734,339;
  • International Publication No. 2013/163789;
  • International Publication No. 2013/173985;
  • International Publication No. 2014/019130;
  • International Publication No. 2014/110495;
  • U.S. Patent Application Publication No. 2008/0185432;
  • U.S. Patent Application Publication No. 2009/0134221;
  • U.S. Patent Application Publication No. 2010/0177080;
  • U.S. Patent Application Publication No. 2010/0177076;
  • U.S. Patent Application Publication No. 2010/0177707;
  • U.S. Patent Application Publication No. 2010/0177749;
  • U.S. Patent Application Publication No. 2010/0265880;
  • U.S. Patent Application Publication No. 2011/0202554;
  • U.S. Patent Application Publication No. 2012/0111946;
  • U.S. Patent Application Publication No. 2012/0168511;
  • U.S. Patent Application Publication No. 2012/0168512;
  • U.S. Patent Application Publication No. 2012/0193423;
  • U.S. Patent Application Publication No. 2012/0203647;
  • U.S. Patent Application Publication No. 2012/0223141;
  • U.S. Patent Application Publication No. 2012/0228382;
  • U.S. Patent Application Publication No. 2012/0248188;
  • U.S. Patent Application Publication No. 2013/0043312;
  • U.S. Patent Application Publication No. 2013/0082104;
  • U.S. Patent Application Publication No. 2013/0175341;
  • U.S. Patent Application Publication No. 2013/0175343;
  • U.S. Patent Application Publication No. 2013/0257744;
  • U.S. Patent Application Publication No. 2013/0257759;
  • U.S. Patent Application Publication No. 2013/0270346;
  • U.S. Patent Application Publication No. 2013/0287258;
  • U.S. Patent Application Publication No. 2013/0292475;
  • U.S. Patent Application Publication No. 2013/0292477;
  • U.S. Patent Application Publication No. 2013/0293539;
  • U.S. Patent Application Publication No. 2013/0293540;
  • U.S. Patent Application Publication No. 2013/0306728;
  • U.S. Patent Application Publication No. 2013/0306731;
  • U.S. Patent Application Publication No. 2013/0307964;
  • U.S. Patent Application Publication No. 2013/0308625;
  • U.S. Patent Application Publication No. 2013/0313324;
  • U.S. Patent Application Publication No. 2013/0313325;
  • U.S. Patent Application Publication No. 2013/0342717;
  • U.S. Patent Application Publication No. 2014/0001267;
  • U.S. Patent Application Publication No. 2014/0008439;
  • U.S. Patent Application Publication No. 2014/0025584;
  • U.S. Patent Application Publication No. 2014/0034734;
  • U.S. Patent Application Publication No. 2014/0036848;
  • U.S. Patent Application Publication No. 2014/0039693;
  • U.S. Patent Application Publication No. 2014/0042814;
  • U.S. Patent Application Publication No. 2014/0049120;
  • U.S. Patent Application Publication No. 2014/0049635;
  • U.S. Patent Application Publication No. 2014/0061306;
  • U.S. Patent Application Publication No. 2014/0063289;
  • U.S. Patent Application Publication No. 2014/0066136;
  • U.S. Patent Application Publication No. 2014/0067692;
  • U.S. Patent Application Publication No. 2014/0070005;
  • U.S. Patent Application Publication No. 2014/0071840;
  • U.S. Patent Application Publication No. 2014/0074746;
  • U.S. Patent Application Publication No. 2014/0076974;
  • U.S. Patent Application Publication No. 2014/0078341;
  • U.S. Patent Application Publication No. 2014/0078345;
  • U.S. Patent Application Publication No. 2014/0097249;
  • U.S. Patent Application Publication No. 2014/0098792;
  • U.S. Patent Application Publication No. 2014/0100813;
  • U.S. Patent Application Publication No. 2014/0103115;
  • U.S. Patent Application Publication No. 2014/0104413;
  • U.S. Patent Application Publication No. 2014/0104414;
  • U.S. Patent Application Publication No. 2014/0104416;
  • U.S. Patent Application Publication No. 2014/0104451;
  • U.S. Patent Application Publication No. 2014/0106594;
  • U.S. Patent Application Publication No. 2014/0106725;
  • U.S. Patent Application Publication No. 2014/0108010;
  • U.S. Patent Application Publication No. 2014/0108402;
  • U.S. Patent Application Publication No. 2014/0110485;
  • U.S. Patent Application Publication No. 2014/0114530;
  • U.S. Patent Application Publication No. 2014/0124577;
  • U.S. Patent Application Publication No. 2014/0124579;
  • U.S. Patent Application Publication No. 2014/0125842;
  • U.S. Patent Application Publication No. 2014/0125853;
  • U.S. Patent Application Publication No. 2014/0125999;
  • U.S. Patent Application Publication No. 2014/0129378;
  • U.S. Patent Application Publication No. 2014/0131438;
  • U.S. Patent Application Publication No. 2014/0131441;
  • U.S. Patent Application Publication No. 2014/0131443;
  • U.S. Patent Application Publication No. 2014/0131444;
  • U.S. Patent Application Publication No. 2014/0131445;
  • U.S. Patent Application Publication No. 2014/0131448;
  • U.S. Patent Application Publication No. 2014/0133379;
  • U.S. Patent Application Publication No. 2014/0136208;
  • U.S. Patent Application Publication No. 2014/0140585;
  • U.S. Patent Application Publication No. 2014/0151453;
  • U.S. Patent Application Publication No. 2014/0152882;
  • U.S. Patent Application Publication No. 2014/0158770;
  • U.S. Patent Application Publication No. 2014/0159869;
  • U.S. Patent Application Publication No. 2014/0166755;
  • U.S. Patent Application Publication No. 2014/0166759;
  • U.S. Patent Application Publication No. 2014/0168787;
  • U.S. Patent Application Publication No. 2014/0175165;
  • U.S. Patent Application Publication No. 2014/0175172;
  • U.S. Patent Application Publication No. 2014/0191644;
  • U.S. Patent Application Publication No. 2014/0191913;
  • U.S. Patent Application Publication No. 2014/0197238;
  • U.S. Patent Application Publication No. 2014/0197239;
  • U.S. Patent Application Publication No. 2014/0197304;
  • U.S. Patent Application Publication No. 2014/0214631;
  • U.S. Patent Application Publication No. 2014/0217166;
  • U.S. Patent Application Publication No. 2014/0217180;
  • U.S. Patent Application Publication No. 2014/0231500;
  • U.S. Patent Application Publication No. 2014/0232930;
  • U.S. Patent Application Publication No. 2014/0247315;
  • U.S. Patent Application Publication No. 2014/0263493;
  • U.S. Patent Application Publication No. 2014/0263645;
  • U.S. Patent Application Publication No. 2014/0267609;
  • U.S. Patent Application Publication No. 2014/0270196;
  • U.S. Patent Application Publication No. 2014/0270229;
  • U.S. Patent Application Publication No. 2014/0278387;
  • U.S. Patent Application Publication No. 2014/0278391;
  • U.S. Patent Application Publication No. 2014/0282210;
  • U.S. Patent Application Publication No. 2014/0284384;
  • U.S. Patent Application Publication No. 2014/0288933;
  • U.S. Patent Application Publication No. 2014/0297058;
  • U.S. Patent Application Publication No. 2014/0299665;
  • U.S. Patent Application Publication No. 2014/0312121;
  • U.S. Patent Application Publication No. 2014/0319220;
  • U.S. Patent Application Publication No. 2014/0319221;
  • U.S. Patent Application Publication No. 2014/0326787;
  • U.S. Patent Application Publication No. 2014/0332590;
  • U.S. Patent Application Publication No. 2014/0344943;
  • U.S. Patent Application Publication No. 2014/0346233;
  • U.S. Patent Application Publication No. 2014/0351317;
  • U.S. Patent Application Publication No. 2014/0353373;
  • U.S. Patent Application Publication No. 2014/0361073;
  • U.S. Patent Application Publication No. 2014/0361082;
  • U.S. Patent Application Publication No. 2014/0362184;
  • U.S. Patent Application Publication No. 2014/0363015;
  • U.S. Patent Application Publication No. 2014/0369511;
  • U.S. Patent Application Publication No. 2014/0374483;
  • U.S. Patent Application Publication No. 2014/0374485;
  • U.S. Patent Application Publication No. 2015/0001301;
  • U.S. Patent Application Publication No. 2015/0001304;
  • U.S. Patent Application Publication No. 2015/0003673;
  • U.S. Patent Application Publication No. 2015/0009338;
  • U.S. Patent Application Publication No. 2015/0009610;
  • U.S. Patent Application Publication No. 2015/0014416;
  • U.S. Patent Application Publication No. 2015/0021397;
  • U.S. Patent Application Publication No. 2015/0028102;
  • U.S. Patent Application Publication No. 2015/0028103;
  • U.S. Patent Application Publication No. 2015/0028104;
  • U.S. Patent Application Publication No. 2015/0029002;
  • U.S. Patent Application Publication No. 2015/0032709;
  • U.S. Patent Application Publication No. 2015/0039309;
  • U.S. Patent Application Publication No. 2015/0039878;
  • U.S. Patent Application Publication No. 2015/0040378;
  • U.S. Patent Application Publication No. 2015/0048168;
  • U.S. Patent Application Publication No. 2015/0049347;
  • U.S. Patent Application Publication No. 2015/0051992;
  • U.S. Patent Application Publication No. 2015/0053766;
  • U.S. Patent Application Publication No. 2015/0053768;
  • U.S. Patent Application Publication No. 2015/0053769;
  • U.S. Patent Application Publication No. 2015/0060544;
  • U.S. Patent Application Publication No. 2015/0062366;
  • U.S. Patent Application Publication No. 2015/0063215;
  • U.S. Patent Application Publication No. 2015/0063676;
  • U.S. Patent Application Publication No. 2015/0069130;
  • U.S. Patent Application Publication No. 2015/0071819;
  • U.S. Patent Application Publication No. 2015/0083800;
  • U.S. Patent Application Publication No. 2015/0086114;
  • U.S. Patent Application Publication No. 2015/0088522;
  • U.S. Patent Application Publication No. 2015/0096872;
  • U.S. Patent Application Publication No. 2015/0099557;
  • U.S. Patent Application Publication No. 2015/0100196;
  • U.S. Patent Application Publication No. 2015/0102109;
  • U.S. Patent Application Publication No. 2015/0115035;
  • U.S. Patent Application Publication No. 2015/0127791;
  • U.S. Patent Application Publication No. 2015/0128116;
  • U.S. Patent Application Publication No. 2015/0129659;
  • U.S. Patent Application Publication No. 2015/0133047;
  • U.S. Patent Application Publication No. 2015/0134470;
  • U.S. Patent Application Publication No. 2015/0136851;
  • U.S. Patent Application Publication No. 2015/0136854;
  • U.S. Patent Application Publication No. 2015/0142492;
  • U.S. Patent Application Publication No. 2015/0144692;
  • U.S. Patent Application Publication No. 2015/0144698;
  • U.S. Patent Application Publication No. 2015/0144701;
  • U.S. Patent Application Publication No. 2015/0149946;
  • U.S. Patent Application Publication No. 2015/0161429;
  • U.S. Patent Application Publication No. 2015/0169925;
  • U.S. Patent Application Publication No. 2015/0169929;
  • U.S. Patent Application Publication No. 2015/0178523;
  • U.S. Patent Application Publication No. 2015/0178534;
  • U.S. Patent Application Publication No. 2015/0178535;
  • U.S. Patent Application Publication No. 2015/0178536;
  • U.S. Patent Application Publication No. 2015/0178537;
  • U.S. Patent Application Publication No. 2015/0181093;
  • U.S. Patent Application Publication No. 2015/0181109;
  • U.S. patent application Ser. No. 13/367,978 for a Laser Scanning Module Employing an Elastomeric U-Hinge Based Laser Scanning Assembly, filed Feb. 7, 2012 (Feng et al.);
  • U.S. patent application Ser. No. 29/458,405 for an Electronic Device, filed Jun. 19, 2013 (Fitch et al.);
  • U.S. patent application Ser. No. 29/459,620 for an Electronic Device Enclosure, filed Jul. 2, 2013 (London et al.);
  • U.S. patent application Ser. No. 29/468,118 for an Electronic Device Case, filed Sep. 26, 2013 (Oberpriller et al.);
  • U.S. patent application Ser. No. 14/150,393 for Indicia-reader Having Unitary Construction Scanner, filed Jan. 8, 2014 (Colavito et al.);
  • U.S. patent application Ser. No. 14/200,405 for Indicia Reader for Size-Limited Applications filed Mar. 7, 2014 (Feng et al.);
  • U.S. patent application Ser. No. 14/231,898 for Hand-Mounted Indicia-Reading Device with Finger Motion Triggering filed Apr. 1, 2014 (Van Horn et al.);
  • U.S. patent application Ser. No. 29/486,759 for an Imaging Terminal, filed Apr. 2, 2014 (Oberpriller et al.);
  • U.S. patent application Ser. No. 14/257,364 for Docking System and Method Using Near Field Communication filed Apr. 21, 2014 (Showering);
  • U.S. patent application Ser. No. 14/264,173 for Autofocus Lens System for Indicia Readers filed Apr. 29, 2014 (Ackley et al.);
  • U.S. patent application Ser. No. 14/277,337 for MULTIPURPOSE OPTICAL READER, filed May 14, 2014 (Jovanovski et al.);
  • U.S. patent application Ser. No. 14/283,282 for TERMINAL HAVING ILLUMINATION AND FOCUS CONTROL filed May 21, 2014 (Liu et al.);
  • U.S. patent application Ser. No. 14/327,827 for a MOBILE-PHONE ADAPTER FOR ELECTRONIC TRANSACTIONS, filed Jul. 10, 2014 (Hejl);
  • U.S. patent application Ser. No. 14/334,934 for a SYSTEM AND METHOD FOR INDICIA VERIFICATION, filed Jul. 18, 2014 (Hejl);
  • U.S. patent application Ser. No. 14/339,708 for LASER SCANNING CODE SYMBOL READING SYSTEM, filed Jul. 24, 2014 (Xian et al.);
  • U.S. patent application Ser. No. 14/340,627 for an AXIALLY REINFORCED FLEXIBLE SCAN ELEMENT, filed Jul. 25, 2014 (Rueblinger et al.);
  • U.S. patent application Ser. No. 14/446,391 for MULTIFUNCTION POINT OF SALE APPARATUS WITH OPTICAL SIGNATURE CAPTURE filed Jul. 30, 2014 (Good et al.);
  • U.S. patent application Ser. No. 14/452,697 for INTERACTIVE INDICIA READER, filed Aug. 6, 2014 (Todeschini);
  • U.S. patent application Ser. No. 14/453,019 for DIMENSIONING SYSTEM WITH GUIDED ALIGNMENT, filed Aug. 6, 2014 (Li et al.);
  • U.S. patent application Ser. No. 14/462,801 for MOBILE COMPUTING DEVICE WITH DATA COGNITION SOFTWARE, filed on Aug. 19, 2014 (Todeschini et al.);
  • U.S. patent application Ser. No. 14/483,056 for VARIABLE DEPTH OF FIELD BARCODE SCANNER filed Sep. 10, 2014 (McCloskey et al.);
  • U.S. patent application Ser. No. 14/513,808 for IDENTIFYING INVENTORY ITEMS IN A STORAGE FACILITY filed Oct. 14, 2014 (Singel et al.);
  • U.S. patent application Ser. No. 14/519,195 for HANDHELD DIMENSIONING SYSTEM WITH FEEDBACK filed Oct. 21, 2014 (Laffargue et al.);
  • U.S. patent application Ser. No. 14/519,179 for DIMENSIONING SYSTEM WITH MULTIPATH INTERFERENCE MITIGATION filed Oct. 21, 2014 (Thuries et al.);
  • U.S. patent application Ser. No. 14/519,211 for SYSTEM AND METHOD FOR DIMENSIONING filed Oct. 21, 2014 (Ackley et al.);
  • U.S. patent application Ser. No. 14/519,233 for HANDHELD DIMENSIONER WITH DATA-QUALITY INDICATION filed Oct. 21, 2014 (Laffargue et al.);
  • U.S. patent application Ser. No. 14/519,249 for HANDHELD DIMENSIONING SYSTEM WITH MEASUREMENT-CONFORMANCE FEEDBACK filed Oct. 21, 2014 (Ackley et al.);
  • U.S. patent application Ser. No. 14/527,191 for METHOD AND SYSTEM FOR RECOGNIZING SPEECH USING WILDCARDS IN AN EXPECTED RESPONSE filed Oct. 29, 2014 (Braho et al.);
  • U.S. patent application Ser. No. 14/529,563 for ADAPTABLE INTERFACE FOR A MOBILE COMPUTING DEVICE filed Oct. 31, 2014 (Schoon et al.);
  • U.S. patent application Ser. No. 14/529,857 for BARCODE READER WITH SECURITY FEATURES filed Oct. 31, 2014 (Todeschini et al.);
  • U.S. patent application Ser. No. 14/398,542 for PORTABLE ELECTRONIC DEVICES HAVING A SEPARATE LOCATION TRIGGER UNIT FOR USE IN CONTROLLING AN APPLICATION UNIT filed Nov. 3, 2014 (Bian et al.);
  • U.S. patent application Ser. No. 14/531,154 for DIRECTING AN INSPECTOR THROUGH AN INSPECTION filed Nov. 3, 2014 (Miller et al.);
  • U.S. patent application Ser. No. 14/533,319 for BARCODE SCANNING SYSTEM USING WEARABLE DEVICE WITH EMBEDDED CAMERA filed Nov. 5, 2014 (Todeschini);
  • U.S. patent application Ser. No. 14/535,764 for CONCATENATED EXPECTED RESPONSES FOR SPEECH RECOGNITION filed Nov. 7, 2014 (Braho et al.);
  • U.S. patent application Ser. No. 14/568,305 for AUTO-CONTRAST VIEWFINDER FOR AN INDICIA READER filed Dec. 12, 2014 (Todeschini);
  • U.S. patent application Ser. No. 14/573,022 for DYNAMIC DIAGNOSTIC INDICATOR GENERATION filed Dec. 17, 2014 (Goldsmith);
  • U.S. patent application Ser. No. 14/578,627 for SAFETY SYSTEM AND METHOD filed Dec. 22, 2014 (Ackley et al.);
  • U.S. patent application Ser. No. 14/580,262 for MEDIA GATE FOR THERMAL TRANSFER PRINTERS filed Dec. 23, 2014 (Bowles);
  • U.S. patent application Ser. No. 14/590,024 for SHELVING AND PACKAGE LOCATING SYSTEMS FOR DELIVERY VEHICLES filed Jan. 6, 2015 (Payne);
  • U.S. patent application Ser. No. 14/596,757 for SYSTEM AND METHOD FOR DETECTING BARCODE PRINTING ERRORS filed Jan. 14, 2015 (Ackley);
  • U.S. patent application Ser. No. 14/416,147 for OPTICAL READING APPARATUS HAVING VARIABLE SETTINGS filed Jan. 21, 2015 (Chen et al.);
  • U.S. patent application Ser. No. 14/614,706 for DEVICE FOR SUPPORTING AN ELECTRONIC TOOL ON A USER'S HAND filed Feb. 5, 2015 (Oberpriller et al.);
  • U.S. patent application Ser. No. 14/614,796 for CARGO APPORTIONMENT TECHNIQUES filed Feb. 5, 2015 (Morton et al.);
  • U.S. patent application Ser. No. 29/516,892 for TABLET COMPUTER filed Feb. 6, 2015 (Bidwell et al.);
  • U.S. patent application Ser. No. 14/619,093 for METHODS FOR TRAINING A SPEECH RECOGNITION SYSTEM filed Feb. 11, 2015 (Pecorari);
  • U.S. patent application Ser. No. 14/628,708 for DEVICE, SYSTEM, AND METHOD FOR DETERMINING THE STATUS OF CHECKOUT LANES filed Feb. 23, 2015 (Todeschini);
  • U.S. patent application Ser. No. 14/630,841 for TERMINAL INCLUDING IMAGING ASSEMBLY filed Feb. 25, 2015 (Gomez et al.);
  • U.S. patent application Ser. No. 14/635,346 for SYSTEM AND METHOD FOR RELIABLE STORE-AND-FORWARD DATA HANDLING BY ENCODED INFORMATION READING TERMINALS filed Mar. 2, 2015 (Sevier);
  • U.S. patent application Ser. No. 29/519,017 for SCANNER filed Mar. 2, 2015 (Zhou et al.);
  • U.S. patent application Ser. No. 14/405,278 for DESIGN PATTERN FOR SECURE STORE filed Mar. 9, 2015 (Zhu et al.);
  • U.S. patent application Ser. No. 14/660,970 for DECODABLE INDICIA READING TERMINAL WITH COMBINED ILLUMINATION filed Mar. 18, 2015 (Kearney et al.);
  • U.S. patent application Ser. No. 14/661,013 for REPROGRAMMING SYSTEM AND METHOD FOR DEVICES INCLUDING PROGRAMMING SYMBOL filed Mar. 18, 2015 (Soule et al.);
  • U.S. patent application Ser. No. 14/662,922 for MULTIFUNCTION POINT OF SALE SYSTEM filed Mar. 19, 2015 (Van Horn et al.);
  • U.S. patent application Ser. No. 14/663,638 for VEHICLE MOUNT COMPUTER WITH CONFIGURABLE IGNITION SWITCH BEHAVIOR filed Mar. 20, 2015 (Davis et al.);
  • U.S. patent application Ser. No. 14/664,063 for METHOD AND APPLICATION FOR SCANNING A BARCODE WITH A SMART DEVICE WHILE CONTINUOUSLY RUNNING AND DISPLAYING AN APPLICATION ON THE SMART DEVICE DISPLAY filed Mar. 20, 2015 (Todeschini);
  • U.S. patent application Ser. No. 14/669,280 for TRANSFORMING COMPONENTS OF A WEB PAGE TO VOICE PROMPTS filed Mar. 26, 2015 (Funyak et al.);
  • U.S. patent application Ser. No. 14/674,329 for AIMER FOR BARCODE SCANNING filed Mar. 31, 2015 (Bidwell);
  • U.S. patent application Ser. No. 14/676,109 for INDICIA READER filed Apr. 1, 2015 (Huck);
  • U.S. patent application Ser. No. 14/676,327 for DEVICE MANAGEMENT PROXY FOR SECURE DEVICES filed Apr. 1, 2015 (Yeakley et al.);
  • U.S. patent application Ser. No. 14/676,898 for NAVIGATION SYSTEM CONFIGURED TO INTEGRATE MOTION SENSING DEVICE INPUTS filed Apr. 2, 2015 (Showering);
  • U.S. patent application Ser. No. 14/679,275 for DIMENSIONING SYSTEM CALIBRATION SYSTEMS AND METHODS filed Apr. 6, 2015 (Laffargue et al.);
  • U.S. patent application Ser. No. 29/523,098 for HANDLE FOR A TABLET COMPUTER filed Apr. 7, 2015 (Bidwell et al.);
  • U.S. patent application Ser. No. 14/682,615 for SYSTEM AND METHOD FOR POWER MANAGEMENT OF MOBILE DEVICES filed Apr. 9, 2015 (Murawski et al.);
  • U.S. patent application Ser. No. 14/686,822 for MULTIPLE PLATFORM SUPPORT SYSTEM AND METHOD filed Apr. 15, 2015 (Qu et al.);
  • U.S. patent application Ser. No. 14/687,289 for SYSTEM FOR COMMUNICATION VIA A PERIPHERAL HUB filed Apr. 15, 2015 (Kohtz et al.);
  • U.S. patent application Ser. No. 29/524,186 for SCANNER filed Apr. 17, 2015 (Zhou et al.);
  • U.S. patent application Ser. No. 14/695,364 for MEDICATION MANAGEMENT SYSTEM filed Apr. 24, 2015 (Sewell et al.);
  • U.S. patent application Ser. No. 14/695,923 for SECURE UNATTENDED NETWORK AUTHENTICATION filed Apr. 24, 2015 (Kubler et al.);
  • U.S. patent application Ser. No. 29/525,068 for TABLET COMPUTER WITH REMOVABLE SCANNING DEVICE filed Apr. 27, 2015 (Schulte et al.);
  • U.S. patent application Ser. No. 14/699,436 for SYMBOL READING SYSTEM HAVING PREDICTIVE DIAGNOSTICS filed Apr. 29, 2015 (Nahill et al.);
  • U.S. patent application Ser. No. 14/702,110 for SYSTEM AND METHOD FOR REGULATING BARCODE DATA INJECTION INTO A RUNNING APPLICATION ON A SMART DEVICE filed May 1, 2015 (Todeschini et al.);
  • U.S. patent application Ser. No. 14/702,979 for TRACKING BATTERY CONDITIONS filed May 4, 2015 (Young et al.);
  • U.S. patent application Ser. No. 14/704,050 for INTERMEDIATE LINEAR POSITIONING filed May 5, 2015 (Charpentier et al.);
  • U.S. patent application Ser. No. 14/705,012 for HANDS-FREE HUMAN MACHINE INTERFACE RESPONSIVE TO A DRIVER OF A VEHICLE filed May 6, 2015 (Fitch et al.);
  • U.S. patent application Ser. No. 14/705,407 for METHOD AND SYSTEM TO PROTECT SOFTWARE-BASED NETWORK-CONNECTED DEVICES FROM ADVANCED PERSISTENT THREAT filed May 6, 2015 (Hussey et al.);
  • U.S. patent application Ser. No. 14/707,037 for SYSTEM AND METHOD FOR DISPLAY OF INFORMATION USING A VEHICLE-MOUNT COMPUTER filed May 8, 2015 (Chamberlin);
  • U.S. patent application Ser. No. 14/707,123 for APPLICATION INDEPENDENT DEX/UCS INTERFACE filed May 8, 2015 (Pape);
  • U.S. patent application Ser. No. 14/707,492 for METHOD AND APPARATUS FOR READING OPTICAL INDICIA USING A PLURALITY OF DATA SOURCES filed May 8, 2015 (Smith et al.);
  • U.S. patent application Ser. No. 14/710,666 for PRE-PAID USAGE SYSTEM FOR ENCODED INFORMATION READING TERMINALS filed May 13, 2015 (Smith);
  • U.S. patent application Ser. No. 29/526,918 for CHARGING BASE filed May 14, 2015 (Fitch et al.);
  • U.S. patent application Ser. No. 14/715,672 for AUGMENTED REALITY ENABLED HAZARD DISPLAY filed May 19, 2015 (Venkatesha et al.);
  • U.S. patent application Ser. No. 14/715,916 for EVALUATING IMAGE VALUES filed May 19, 2015 (Ackley);
  • U.S. patent application Ser. No. 14/722,608 for INTERACTIVE USER INTERFACE FOR CAPTURING A DOCUMENT IN AN IMAGE SIGNAL filed May 27, 2015 (Showering et al.);
  • U.S. patent application Ser. No. 29/528,165 for IN-COUNTER BARCODE SCANNER filed May 27, 2015 (Oberpriller et al.);
  • U.S. patent application Ser. No. 14/724,134 for ELECTRONIC DEVICE WITH WIRELESS PATH SELECTION CAPABILITY filed May 28, 2015 (Wang et al.);
  • U.S. patent application Ser. No. 14/724,849 for METHOD OF PROGRAMMING THE DEFAULT CABLE INTERFACE SOFTWARE IN AN INDICIA READING DEVICE filed May 29, 2015 (Barten);
  • U.S. patent application Ser. No. 14/724,908 for IMAGING APPARATUS HAVING IMAGING ASSEMBLY filed May 29, 2015 (Barber et al.);
  • U.S. patent application Ser. No. 14/725,352 for APPARATUS AND METHODS FOR MONITORING ONE OR MORE PORTABLE DATA TERMINALS (Caballero et al.);
  • U.S. patent application Ser. No. 29/528,590 for ELECTRONIC DEVICE filed May 29, 2015 (Fitch et al.);
  • U.S. patent application Ser. No. 29/528,890 for MOBILE COMPUTER HOUSING filed Jun. 2, 2015 (Fitch et al.);
  • U.S. patent application Ser. No. 14/728,397 for DEVICE MANAGEMENT USING VIRTUAL INTERFACES filed Jun. 2, 2015 (Caballero);
  • U.S. patent application Ser. No. 14/732,870 for DATA COLLECTION MODULE AND SYSTEM filed Jun. 8, 2015 (Powilleit);
  • U.S. patent application Ser. No. 29/529,441 for INDICIA READING DEVICE filed Jun. 8, 2015 (Zhou et al.);
  • U.S. patent application Ser. No. 14/735,717 for INDICIA-READING SYSTEMS HAVING AN INTERFACE WITH A USER'S NERVOUS SYSTEM filed Jun. 10, 2015 (Todeschini);
  • U.S. patent application Ser. No. 14/738,038 for METHOD OF AND SYSTEM FOR DETECTING OBJECT WEIGHING INTERFERENCES filed Jun. 12, 2015 (Amundsen et al.);
  • U.S. patent application Ser. No. 14/740,320 for TACTILE SWITCH FOR A MOBILE ELECTRONIC DEVICE filed Jun. 16, 2015 (Bandringa);
  • U.S. patent application Ser. No. 14/740,373 for CALIBRATING A VOLUME DIMENSIONER filed Jun. 16, 2015 (Ackley et al.);
  • U.S. patent application Ser. No. 14/742,818 for INDICIA READING SYSTEM EMPLOYING DIGITAL GAIN CONTROL filed Jun. 18, 2015 (Xian et al.);
  • U.S. patent application Ser. No. 14/743,257 for WIRELESS MESH POINT PORTABLE DATA TERMINAL filed Jun. 18, 2015 (Wang et al.);
  • U.S. patent application Ser. No. 29/530,600 for CYCLONE filed Jun. 18, 2015 (Vargo et al.);
  • U.S. patent application Ser. No. 14/744,633 for IMAGING APPARATUS COMPRISING IMAGE SENSOR ARRAY HAVING SHARED GLOBAL SHUTTER CIRCUITRY filed Jun. 19, 2015 (Wang);
  • U.S. patent application Ser. No. 14/744,836 for CLOUD-BASED SYSTEM FOR READING OF DECODABLE INDICIA filed Jun. 19, 2015 (Todeschini et al.);
  • U.S. patent application Ser. No. 14/745,006 for SELECTIVE OUTPUT OF DECODED MESSAGE DATA filed Jun. 19, 2015 (Todeschini et al.);
  • U.S. patent application Ser. No. 14/747,197 for OPTICAL PATTERN PROJECTOR filed Jun. 23, 2015 (Thuries et al.);
  • U.S. patent application Ser. No. 14/747,490 for DUAL-PROJECTOR THREE-DIMENSIONAL SCANNER filed Jun. 23, 2015 (Jovanovski et al.); and
  • U.S. patent application Ser. No. 14/748,446 for CORDLESS INDICIA READER WITH A MULTIFUNCTION COIL FOR WIRELESS CHARGING AND EAS DEACTIVATION, filed Jun. 24, 2015 (Xie et al.).


* * *

In the specification and/or figures, typical embodiments of the invention have been disclosed. The present invention is not limited to such exemplary embodiments. The use of the term “and/or” includes any and all combinations of one or more of the associated listed items. The figures are schematic representations and so are not necessarily drawn to scale. Unless otherwise noted, specific terms have been used in a generic and descriptive sense and not for purposes of limitation.


In the description above, a flow-charted technique may be described as a series of sequential actions. Unless expressly stated to the contrary, the sequence of the actions and the party performing the actions may be freely changed without departing from the scope of the teachings. Actions may be added, deleted, or altered in several ways. Similarly, the actions may be re-ordered or looped (repeated). Further, although processes, methods, algorithms, or the like may be described in a sequential order, such processes, methods, algorithms, or any combination thereof may be operable to be performed in alternative orders. Further, some actions within a process, method, or algorithm may be performed simultaneously during at least a point in time (e.g., actions performed in parallel), and may also be performed in whole, in part, or in any combination thereof.


Further, in some embodiments, certain method decision steps or branching points discussed herein may be eliminated within the scope and spirit of the present system and method; still further, additional options, alternative outcomes, or entire additional decision or branching points may be added within the scope and spirit of the present system and method.


As used herein, the terms “comprises,” “comprising,” “includes,” “including,” “has,” “having” or any other variation thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, article, or apparatus that comprises a list of features is not necessarily limited only to those features but may include other features not expressly listed or inherent to such process, method, article, or apparatus. Further, unless expressly stated to the contrary, “or” refers to an inclusive-or and not to an exclusive-or. For example, a condition A or B is satisfied by any one of the following: A is true (or present) and B is false (or not present), A is false (or not present) and B is true (or present), and both A and B are true (or present).

Claims
  • 1. A system, comprising: a biometric data collection module; a wireless module; a memory; a processor communicatively coupled to the biometric data collection module, the wireless module and the memory, wherein the processor is configured to: generate, based upon a unique data from an invoice for a transaction, a unique password for data encryption; encrypt the unique data with the unique password to generate ciphertext data; generate a first monochrome two-dimensional barcode (first 2-D barcode) representing the plaintext unique data; generate a second monochrome two-dimensional barcode (second 2-D barcode) representing the ciphertext unique data; and merge the first 2-D barcode and the second 2-D barcode into a merged multicolored 2-D barcode to obtain a final invoice image, wherein: each color of the multicolored 2-D barcode represents a designated pairing of a first color or a second color of the first 2-D barcode with a third color or a fourth color of the second 2-D barcode.
  • 2. The system according to claim 1, wherein the processor is further configured to validate said final invoice image, whereby the integrity of the encoded data is determined.
  • 3. The system according to claim 2, wherein the processor is further configured to: extract a plaintext data 2-D barcode and a ciphertext data 2-D barcode from the merged multicolored 2-D barcode of the final invoice image; restore the plaintext data from the plaintext data 2-D barcode; restore the ciphertext data from the ciphertext data 2-D barcode; determine if the plaintext data restored from the plaintext data 2-D barcode matches the ciphertext data restored from the ciphertext data 2-D barcode; upon determining that the plaintext data restored from the plaintext data 2-D barcode matches the ciphertext data restored from the ciphertext data 2-D barcode, determine that data integrity has been maintained; and upon determining that the plaintext data restored from the plaintext data 2-D barcode does not match the ciphertext data restored from the ciphertext data 2-D barcode, determine that data integrity has been lost.
  • 4. The system according to claim 3, wherein the processor is further configured to: generate, via the data restored from the plaintext 2-D barcode, the unique password for data encryption; and decrypt, via the unique password, the ciphertext data restored from the ciphertext data 2-D barcode to recover an unencrypted data.
  • 5. The system according to claim 4, wherein the processor is further configured to: extract a plaintext data 2-D barcode and a first ciphertext data 2-D barcode from the merged multicolored 2-D barcode of the final invoice image; restore the plaintext data from the plaintext data 2-D barcode; generate, via the plaintext data restored from the actual 2-D barcode, the unique password for data encryption; construct a second ciphertext data 2-D barcode based on the plaintext data restored from the plaintext 2-D barcode and the unique password; determine if the first ciphertext 2-D barcode and the second ciphertext 2-D barcode match; upon determining that the first ciphertext 2-D barcode and the second ciphertext 2-D barcode match, determine that data integrity has been maintained; and upon determining that the first ciphertext 2-D barcode and the second ciphertext 2-D barcode do not match, determine that data integrity has been lost.
  • 6. The system according to claim 5, wherein the processor is further configured to generate the unique password for data encryption based upon the unique data by generating the unique password via a proprietary password generation algorithm.
  • 7. The system according to claim 6, wherein the processor is further configured to: generate a unique image shuffling sequence based upon the unique data; obtain a first biometric image and a second biometric image of a respective first and second party to the invoiced transaction; divide each of the first biometric image and the second biometric image into a respective plurality of first image parts and second image parts; spatially shuffle each of the first plurality of image parts and the second plurality of image parts to generate a respective first shuffled biometric image and a second shuffled biometric image, wherein said spatial shuffling is in accordance with the unique image shuffling sequence; merge the first shuffled biometric image and the second shuffled biometric image into a combined shuffled biometric image; and concatenate the merged multicolored 2-D barcode and the combined shuffled biometric image to form the final invoice image.
  • 8. The system according to claim 7, wherein the processor is further configured to generate the image shuffling sequence via a proprietary data shuffling algorithm.
  • 9. The system according to claim 8, wherein the processor is further configured to: validate said final invoice image, whereby the integrity of the encoded shuffled biometric data is determined.
  • 10. The system according to claim 9, wherein the processor is further configured to: separate the final invoice image into two separate images to recover a retrieved merged multicolored 2-D barcode and a retrieved combined shuffled biometric image; extract a plaintext data 2-D barcode from the retrieved merged multicolored 2-D barcode; restore the unique plaintext data from the plaintext data 2-D barcode; generate, based upon the unique data restored from the plaintext 2-D barcode, the unique fingerprint shuffling sequence; recover from the combined shuffled biometric image the first shuffled biometric image and the second shuffled biometric image; and deshuffle, based on the image shuffling sequence, at least one of the first biometric image and the second biometric image.
  • 11. A method, comprising: generating a unique password for data encryption; encrypting unique data with the unique password to generate ciphertext data; generating a first monochrome two-dimensional barcode (first 2-D barcode) representing the plaintext unique data; generating a second monochrome two-dimensional barcode (second 2-D barcode) representing the ciphertext unique data; and merging the first 2-D barcode and the second 2-D barcode into a merged multicolored 2-D barcode to obtain a final image, comprising: overlapping corresponding cells of the first 2-D barcode and the second 2-D barcode.
  • 12. The method of claim 11, wherein each color of the multicolored 2-D barcode represents a designated pairing of a first color or a second color of the first 2-D barcode with a third color or a fourth color of the second 2-D barcode.
  • 13. The method of claim 12, wherein said validating comprises: extracting a plaintext data 2-D barcode and a first ciphertext data 2-D barcode from the merged multicolored 2-D barcode of the final image; restoring the plaintext data from the plaintext data 2-D barcode; generating, via the plaintext data restored from the actual 2-D barcode, the unique password for data encryption; constructing a second ciphertext data 2-D barcode based on the plaintext data restored from the plaintext 2-D barcode and the unique password; determining if the first ciphertext 2-D barcode and the second ciphertext 2-D barcode match; upon determining that the first ciphertext 2-D barcode and the second ciphertext 2-D barcode match, determining that data integrity has been maintained; and upon determining that the first ciphertext 2-D barcode and the second ciphertext 2-D barcode do not match, determining that data integrity has been lost.
  • 14. The method of claim 11, further comprising: validating said final image, whereby the integrity of the encoded data is determined, wherein said validating comprises: extracting a plaintext data 2-D barcode and a ciphertext data 2-D barcode from the merged multicolored 2-D barcode of the final image; restoring the plaintext data from the plaintext data 2-D barcode; restoring the ciphertext data from the ciphertext data 2-D barcode; determining if the plaintext data restored from the plaintext data 2-D barcode matches the ciphertext data restored from the ciphertext data 2-D barcode; upon determining that the plaintext data restored from the plaintext data 2-D barcode matches the ciphertext data restored from the ciphertext data 2-D barcode, determining that data integrity has been maintained; and upon determining that the plaintext data restored from the plaintext data 2-D barcode does not match the ciphertext data restored from the ciphertext data 2-D barcode, determining that data integrity has been lost.
  • 15. The method of claim 14, wherein determining if the plaintext data restored from the plaintext data 2-D barcode matches the ciphertext data restored from the ciphertext data 2-D barcode comprises: generating, via the data restored from the plaintext 2-D barcode, the unique password for data encryption; and decrypting, via the unique password, the ciphertext data restored from the ciphertext data 2-D barcode to recover an unencrypted data.
  • 16. The method of claim 11, wherein generating the unique password for data encryption based upon the unique data comprises generating the unique password via a proprietary password generation algorithm.
  • 17. The method of claim 11, further comprising: generating a unique image shuffling sequence based upon the unique data; obtaining a first biometric image and a second biometric image of a respective first and second party to the invoiced transaction; dividing each of the first biometric image and the second biometric image into a respective plurality of first image parts and second image parts; spatially shuffling each of the first plurality of image parts and the second plurality of image parts to generate a respective first shuffled biometric image and a second shuffled biometric image, wherein said spatial shuffling is in accordance with the unique image shuffling sequence; merging the first shuffled biometric image and the second shuffled biometric image into a combined shuffled biometric image; and concatenating the merged multicolored 2-D barcode and the combined shuffled biometric image to form the final image.
  • 18. The method of claim 17, wherein generating the unique image shuffling sequence based upon the unique data comprises generating the image shuffling sequence via a proprietary data shuffling algorithm.
  • 19. The method of claim 17, further comprising: validating said final image, whereby the integrity of the encoded shuffled biometric data is determined.
  • 20. The method of claim 19, wherein said validating comprises: separating the final image into two separate images to recover a retrieved merged multicolored 2-D barcode and a retrieved combined shuffled biometric image; extracting a plaintext data 2-D barcode from the retrieved merged multicolored 2-D barcode; restoring the unique plaintext data from the plaintext data 2-D barcode; generating, based upon the unique data restored from the plaintext 2-D barcode, the unique fingerprint shuffling sequence; recovering from the combined shuffled biometric image the first shuffled biometric image and the second shuffled biometric image; and deshuffling, based on the image shuffling sequence, at least one of the first biometric image and the second biometric image.
Priority Claims (1)
Number Date Country Kind
201711020273 Jun 2017 IN national
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 15/688,133, filed Aug. 28, 2017, which claims priority to and the benefit of Indian Patent Application No. 201711020273, filed Jun. 9, 2017, the entire contents of which are incorporated herein by reference.

US Referenced Citations (718)
Number Name Date Kind
6832725 Gardiner et al. Dec 2004 B2
7128266 Zhu et al. Oct 2006 B2
7159783 Walczyk et al. Jan 2007 B2
D559261 Jung et al. Jan 2008 S
7413127 Ehrhart et al. Aug 2008 B2
7726575 Wang et al. Jun 2010 B2
7783893 Gorelik et al. Aug 2010 B2
8294969 Plesko Oct 2012 B2
8317105 Kotlarsky et al. Nov 2012 B2
8322622 Liu Dec 2012 B2
8366005 Kotlarsky et al. Feb 2013 B2
8371507 Haggerty et al. Feb 2013 B2
8376233 Van Horn et al. Feb 2013 B2
8381979 Franz Feb 2013 B2
8390909 Plesko Mar 2013 B2
8408464 Zhu et al. Apr 2013 B2
8408468 Horn et al. Apr 2013 B2
8408469 Good Apr 2013 B2
8424768 Rueblinger et al. Apr 2013 B2
8448863 Xian et al. May 2013 B2
8457013 Essinger et al. Jun 2013 B2
8459557 Havens et al. Jun 2013 B2
8469272 Kearney Jun 2013 B2
8474712 Kearney et al. Jul 2013 B2
8479992 Kotlarsky et al. Jul 2013 B2
8490877 Kearney Jul 2013 B2
8517271 Kotlarsky et al. Aug 2013 B2
8523076 Good Sep 2013 B2
8528818 Ehrhart et al. Sep 2013 B2
8544737 Gomez et al. Oct 2013 B2
8548420 Grunow et al. Oct 2013 B2
8550335 Samek et al. Oct 2013 B2
8550354 Gannon et al. Oct 2013 B2
8550357 Kearney Oct 2013 B2
8556174 Kosecki et al. Oct 2013 B2
8556176 Van Horn et al. Oct 2013 B2
8556177 Hussey et al. Oct 2013 B2
8559767 Barber et al. Oct 2013 B2
8561895 Gomez et al. Oct 2013 B2
8561903 Sauerwein Oct 2013 B2
8561905 Edmonds et al. Oct 2013 B2
8565107 Pease et al. Oct 2013 B2
8571307 Li et al. Oct 2013 B2
8579200 Samek et al. Nov 2013 B2
8583924 Caballero et al. Nov 2013 B2
8584945 Wang et al. Nov 2013 B2
8587595 Wang Nov 2013 B2
8587697 Hussey et al. Nov 2013 B2
8588869 Sauerwein et al. Nov 2013 B2
8590789 Nahill et al. Nov 2013 B2
8596539 Havens et al. Dec 2013 B2
8596542 Havens et al. Dec 2013 B2
8596543 Havens et al. Dec 2013 B2
8599271 Havens et al. Dec 2013 B2
8599957 Peake et al. Dec 2013 B2
8600158 Li et al. Dec 2013 B2
8600167 Showering Dec 2013 B2
8602309 Longacre et al. Dec 2013 B2
8608053 Meier et al. Dec 2013 B2
8608071 Liu et al. Dec 2013 B2
8611309 Wang et al. Dec 2013 B2
8615487 Gomez et al. Dec 2013 B2
8621123 Caballero Dec 2013 B2
8622303 Meier et al. Jan 2014 B2
8628013 Ding Jan 2014 B2
8628015 Wang et al. Jan 2014 B2
8628016 Winegar Jan 2014 B2
8629926 Wang Jan 2014 B2
8630491 Longacre et al. Jan 2014 B2
8635309 Berthiaume et al. Jan 2014 B2
8636200 Kearney Jan 2014 B2
8636212 Nahill et al. Jan 2014 B2
8636215 Ding et al. Jan 2014 B2
8636224 Wang Jan 2014 B2
8638806 Wang et al. Jan 2014 B2
8640958 Lu et al. Feb 2014 B2
8640960 Wang et al. Feb 2014 B2
8643717 Li et al. Feb 2014 B2
8646692 Meier et al. Feb 2014 B2
8646694 Wang et al. Feb 2014 B2
8657200 Ren et al. Feb 2014 B2
8659397 Vargo et al. Feb 2014 B2
8668149 Good Mar 2014 B2
8678285 Kearney Mar 2014 B2
8678286 Smith et al. Mar 2014 B2
8682077 Longacre Mar 2014 B1
D702237 Oberpriller et al. Apr 2014 S
8687282 Feng et al. Apr 2014 B2
8692927 Pease et al. Apr 2014 B2
8695880 Bremer et al. Apr 2014 B2
8698949 Grunow Apr 2014 B2
8702000 Barber et al. Apr 2014 B2
8717494 Gannon May 2014 B2
8720783 Biss et al. May 2014 B2
8723804 Fletcher et al. May 2014 B2
8723904 Marty et al. May 2014 B2
8727223 Wang May 2014 B2
8740082 Wilz Jun 2014 B2
8740085 Furlong et al. Jun 2014 B2
8746563 Hennick et al. Jun 2014 B2
8750445 Peake et al. Jun 2014 B2
8752766 Xian et al. Jun 2014 B2
8756059 Braho et al. Jun 2014 B2
8757495 Qu et al. Jun 2014 B2
8760563 Koziol et al. Jun 2014 B2
8763909 Reed et al. Jul 2014 B2
8777108 Coyle Jul 2014 B2
8777109 Oberpriller et al. Jul 2014 B2
8779898 Havens et al. Jul 2014 B2
8781520 Payne et al. Jul 2014 B2
8783573 Havens et al. Jul 2014 B2
8789757 Batten Jul 2014 B2
8789758 Hawley et al. Jul 2014 B2
8789759 Xian et al. Jul 2014 B2
8794520 Wang et al. Aug 2014 B2
8794522 Ehrhart Aug 2014 B2
8794525 Amundsen et al. Aug 2014 B2
8794526 Wang et al. Aug 2014 B2
8798367 Ellis Aug 2014 B2
8807431 Wang et al. Aug 2014 B2
8807432 Van Horn et al. Aug 2014 B2
8820630 Qu et al. Sep 2014 B2
8822848 Meagher Sep 2014 B2
8824692 Sheerin et al. Sep 2014 B2
8824696 Braho Sep 2014 B2
8842849 Wahl et al. Sep 2014 B2
8844822 Kotlarsky et al. Sep 2014 B2
8844823 Fritz et al. Sep 2014 B2
8849019 Li et al. Sep 2014 B2
D716285 Chaney et al. Oct 2014 S
8851383 Yeakley et al. Oct 2014 B2
8854633 Laffargue Oct 2014 B2
8866963 Grunow et al. Oct 2014 B2
8868421 Braho et al. Oct 2014 B2
8868519 Maloy et al. Oct 2014 B2
8868802 Batten Oct 2014 B2
8868803 Caballero Oct 2014 B2
8870074 Gannon Oct 2014 B1
8879639 Sauerwein Nov 2014 B2
8880426 Smith Nov 2014 B2
8881983 Havens et al. Nov 2014 B2
8881987 Wang Nov 2014 B2
8903172 Smith Dec 2014 B2
8908995 Benos et al. Dec 2014 B2
8910870 Li et al. Dec 2014 B2
8910875 Ren et al. Dec 2014 B2
8914290 Hendrickson et al. Dec 2014 B2
8914788 Pettinelli et al. Dec 2014 B2
8915439 Feng et al. Dec 2014 B2
8915444 Havens et al. Dec 2014 B2
8916789 Woodburn Dec 2014 B2
8918250 Hollifield Dec 2014 B2
8918564 Caballero Dec 2014 B2
8925818 Kosecki et al. Jan 2015 B2
8939374 Jovanovski et al. Jan 2015 B2
8942480 Ellis Jan 2015 B2
8944313 Williams et al. Feb 2015 B2
8944327 Meier et al. Feb 2015 B2
8944332 Harding et al. Feb 2015 B2
8950678 Germaine et al. Feb 2015 B2
D723560 Zhou et al. Mar 2015 S
8967468 Gomez et al. Mar 2015 B2
8971346 Sevier Mar 2015 B2
8976030 Cunningham et al. Mar 2015 B2
8976368 Akel et al. Mar 2015 B2
8978981 Guan Mar 2015 B2
8978983 Bremer et al. Mar 2015 B2
8978984 Hennick et al. Mar 2015 B2
8985456 Zhu et al. Mar 2015 B2
8985457 Soule et al. Mar 2015 B2
8985459 Kearney et al. Mar 2015 B2
8985461 Gelay et al. Mar 2015 B2
8988578 Showering Mar 2015 B2
8988590 Gillet et al. Mar 2015 B2
8991704 Hopper et al. Mar 2015 B2
8996194 Davis et al. Mar 2015 B2
8996384 Funyak et al. Mar 2015 B2
8998091 Edmonds et al. Apr 2015 B2
9002641 Showering Apr 2015 B2
9007368 Laffargue et al. Apr 2015 B2
9010641 Qu et al. Apr 2015 B2
9015513 Murawski et al. Apr 2015 B2
9016576 Brady et al. Apr 2015 B2
D730357 Fitch et al. May 2015 S
9022288 Nahill et al. May 2015 B2
9030964 Essinger et al. May 2015 B2
9033240 Smith et al. May 2015 B2
9033242 Gillet et al. May 2015 B2
9036054 Koziol et al. May 2015 B2
9037344 Chamberlin May 2015 B2
9038911 Xian et al. May 2015 B2
9038915 Smith May 2015 B2
D730901 Oberpriller et al. Jun 2015 S
D730902 Fitch et al. Jun 2015 S
D733112 Chaney et al. Jun 2015 S
9047098 Barten Jun 2015 B2
9047359 Caballero et al. Jun 2015 B2
9047420 Caballero Jun 2015 B2
9047525 Barber Jun 2015 B2
9047531 Showering et al. Jun 2015 B2
9049640 Wang et al. Jun 2015 B2
9053055 Caballero Jun 2015 B2
9053378 Hou et al. Jun 2015 B1
9053380 Xian et al. Jun 2015 B2
9057641 Amundsen et al. Jun 2015 B2
9058526 Powilleit Jun 2015 B2
9061527 Tobin et al. Jun 2015 B2
9064165 Havens et al. Jun 2015 B2
9064167 Xian et al. Jun 2015 B2
9064168 Todeschini et al. Jun 2015 B2
9064254 Todeschini et al. Jun 2015 B2
9066032 Wang Jun 2015 B2
9070032 Corcoran Jun 2015 B2
D734339 Zhou et al. Jul 2015 S
D734751 Oberpriller et al. Jul 2015 S
9076459 Braho et al. Jul 2015 B2
9079423 Bouverie et al. Jul 2015 B2
9080856 Laffargue Jul 2015 B2
9082023 Feng et al. Jul 2015 B2
9084032 Rautiola et al. Jul 2015 B2
9087250 Coyle Jul 2015 B2
9092681 Havens et al. Jul 2015 B2
9092682 Wilz et al. Jul 2015 B2
9092683 Koziol et al. Jul 2015 B2
9093141 Liu Jul 2015 B2
9098763 Lu et al. Aug 2015 B2
9104929 Todeschini Aug 2015 B2
9104934 Li et al. Aug 2015 B2
9107484 Chaney Aug 2015 B2
9111159 Liu et al. Aug 2015 B2
9111166 Cunningham Aug 2015 B2
9135483 Liu et al. Sep 2015 B2
9137009 Gardiner Sep 2015 B1
9141839 Xian et al. Sep 2015 B2
9147096 Wang Sep 2015 B2
9148474 Skvoretz Sep 2015 B2
9158000 Sauerwein Oct 2015 B2
9158340 Reed et al. Oct 2015 B2
9158953 Gillet et al. Oct 2015 B2
9159059 Daddabbo et al. Oct 2015 B2
9165174 Huck Oct 2015 B2
9165714 Huck Oct 2015 B2
9171543 Emerick et al. Oct 2015 B2
9183425 Wang Nov 2015 B2
9189669 Zhu et al. Nov 2015 B2
9195844 Todeschini et al. Nov 2015 B2
9202458 Braho et al. Dec 2015 B2
9208366 Liu Dec 2015 B2
9208367 Wang Dec 2015 B2
9219836 Bouverie et al. Dec 2015 B2
9224022 Ackley et al. Dec 2015 B2
9224024 Bremer et al. Dec 2015 B2
9224027 Van Horn et al. Dec 2015 B2
D747321 London et al. Jan 2016 S
9230140 Ackley Jan 2016 B1
9235553 Fitch et al. Jan 2016 B2
9239950 Fletcher Jan 2016 B2
9245492 Ackley et al. Jan 2016 B2
9443123 Hejl Jan 2016 B2
9248640 Heng Feb 2016 B2
9250652 London et al. Feb 2016 B2
9250712 Todeschini Feb 2016 B1
9251411 Todeschini Feb 2016 B2
9258033 Showering Feb 2016 B2
9261398 Amundsen et al. Feb 2016 B2
9262633 Todeschini et al. Feb 2016 B1
9262660 Lu et al. Feb 2016 B2
9262662 Chen et al. Feb 2016 B2
9262664 Soule, III et al. Feb 2016 B2
9269036 Bremer Feb 2016 B2
9270782 Hala et al. Feb 2016 B2
9274806 Batten Mar 2016 B2
9274812 Doren et al. Mar 2016 B2
9275388 Havens et al. Mar 2016 B2
9277668 Feng et al. Mar 2016 B2
9280693 Feng et al. Mar 2016 B2
9282501 Wang et al. Mar 2016 B2
9286496 Smith Mar 2016 B2
9292969 Laffargue et al. Mar 2016 B2
9297900 Jiang Mar 2016 B2
9298667 Caballero Mar 2016 B2
9298964 Li et al. Mar 2016 B2
9301427 Feng et al. Mar 2016 B2
9304376 Anderson Apr 2016 B2
9310609 Rueblinger et al. Apr 2016 B2
9313377 Todeschini et al. Apr 2016 B2
9317037 Byford et al. Apr 2016 B2
9319548 Showering et al. Apr 2016 B2
D757009 Oberpriller et al. May 2016 S
9342723 Liu et al. May 2016 B2
9342724 McCloskey May 2016 B2
9342827 Smith May 2016 B2
9355294 Smith et al. May 2016 B2
9360304 Xue et al. Jun 2016 B2
9361882 Ressler et al. Jun 2016 B2
9365381 Colonel et al. Jun 2016 B2
9367722 Xian et al. Jun 2016 B2
9373018 Colavito et al. Jun 2016 B2
9375945 Bowles Jun 2016 B1
9378403 Wang et al. Jun 2016 B2
D760719 Zhou et al. Jul 2016 S
9383848 Daghigh Jul 2016 B2
9384374 Bianconi Jul 2016 B2
9390596 Todeschini Jul 2016 B1
9396375 Qu et al. Jul 2016 B2
9398008 Todeschini et al. Jul 2016 B2
D762604 Fitch et al. Aug 2016 S
D762647 Fitch et al. Aug 2016 S
9405011 Showering Aug 2016 B2
9407840 Wang Aug 2016 B2
9411386 Sauerwein Aug 2016 B2
9412242 Van Horn et al. Aug 2016 B2
9418252 Nahill et al. Aug 2016 B2
9418269 Havens et al. Aug 2016 B2
9418270 Van Volkinburg et al. Aug 2016 B2
9423318 Lui et al. Aug 2016 B2
D766244 Zhou et al. Sep 2016 S
9443222 Singel et al. Sep 2016 B2
9448610 Davis et al. Sep 2016 B2
9454689 McCloskey et al. Sep 2016 B2
9464885 Lloyd et al. Oct 2016 B2
9465967 Xian et al. Oct 2016 B2
9478113 Xie et al. Oct 2016 B2
9478983 Kather et al. Oct 2016 B2
D771631 Fitch et al. Nov 2016 S
9481186 Bouverie et al. Nov 2016 B2
9488986 Solanki Nov 2016 B1
9489782 Payne et al. Nov 2016 B2
9490540 Davies et al. Nov 2016 B1
9491729 Rautiola et al. Nov 2016 B2
9497092 Gomez et al. Nov 2016 B2
9507974 Todeschini Nov 2016 B1
9519814 Cudzilo Dec 2016 B2
9521331 Bessettes et al. Dec 2016 B2
9530038 Xian et al. Dec 2016 B2
D777166 Bidwell et al. Jan 2017 S
9558386 Yeakley Jan 2017 B2
9572901 Todeschini Feb 2017 B2
9582696 Barber et al. Feb 2017 B2
9606581 Howe et al. Mar 2017 B1
D783601 Schulte et al. Apr 2017 S
9616749 Chamberlin Apr 2017 B2
9618993 Murawski et al. Apr 2017 B2
D785617 Bidwell et al. May 2017 S
D785636 Oberpriller et al. May 2017 S
9646189 Lu et al. May 2017 B2
9646191 Unemyr et al. May 2017 B2
9652648 Ackley et al. May 2017 B2
9652653 Todeschini et al. May 2017 B2
9656487 Ho et al. May 2017 B2
9659198 Giordano et al. May 2017 B2
D790505 Vargo et al. Jun 2017 S
D790546 Zhou et al. Jun 2017 S
9680282 Hanenburg Jun 2017 B2
9697401 Feng et al. Jul 2017 B2
9701140 Alaganchetty et al. Jul 2017 B1
9715614 Todeschini et al. Jul 2017 B2
9734493 Gomez et al. Aug 2017 B2
9923950 DATE NAME Mar 2018 B1
9984366 Jammikunta May 2018 B1
20070063048 Havens et al. Mar 2007 A1
20080185432 Caballero et al. Aug 2008 A1
20090134221 Zhu et al. May 2009 A1
20100177076 Essinger et al. Jul 2010 A1
20100177080 Essinger et al. Jul 2010 A1
20100177707 Essinger et al. Jul 2010 A1
20100177749 Essinger et al. Jul 2010 A1
20100265880 Rautiola et al. Oct 2010 A1
20110169999 Grunow et al. Jul 2011 A1
20110202554 Powilleit et al. Aug 2011 A1
20120111946 Golant May 2012 A1
20120168511 Kotlarsky et al. Jul 2012 A1
20120168512 Kotlarsky et al. Jul 2012 A1
20120193423 Samek et al. Aug 2012 A1
20120203647 Smith Aug 2012 A1
20120223141 Good et al. Sep 2012 A1
20120228382 Havens et al. Sep 2012 A1
20120248188 Kearney Oct 2012 A1
20130043312 Van Horn Feb 2013 A1
20130075168 Amundsen et al. Mar 2013 A1
20130082104 Kearney et al. Apr 2013 A1
20130175341 Kearney et al. Jul 2013 A1
20130175343 Good Jul 2013 A1
20130257744 Daghigh et al. Oct 2013 A1
20130257759 Daghigh Oct 2013 A1
20130270346 Xian et al. Oct 2013 A1
20130287258 Kearney Oct 2013 A1
20130292475 Kotlarsky et al. Nov 2013 A1
20130292477 Hennick et al. Nov 2013 A1
20130293539 Hunt et al. Nov 2013 A1
20130293540 Laffargue et al. Nov 2013 A1
20130306728 Thuries et al. Nov 2013 A1
20130306731 Pedraro Nov 2013 A1
20130307964 Bremer et al. Nov 2013 A1
20130308625 Park et al. Nov 2013 A1
20130313324 Koziol et al. Nov 2013 A1
20130313325 Wilz et al. Nov 2013 A1
20130332524 Fiala et al. Dec 2013 A1
20130342717 Havens et al. Dec 2013 A1
20140001267 Giordano et al. Jan 2014 A1
20140002828 Laffargue et al. Jan 2014 A1
20140008439 Wang Jan 2014 A1
20140025584 Liu et al. Jan 2014 A1
20140100813 Showering Jan 2014 A1
20140034734 Sauerwein Feb 2014 A1
20140036848 Pease et al. Feb 2014 A1
20140039693 Havens et al. Feb 2014 A1
20140042814 Kather et al. Feb 2014 A1
20140049120 Kohtz et al. Feb 2014 A1
20140049635 Laffargue et al. Feb 2014 A1
20140061306 Wu et al. Mar 2014 A1
20140063289 Hussey et al. Mar 2014 A1
20140066136 Sauerwein et al. Mar 2014 A1
20140067692 Ye et al. Mar 2014 A1
20140070005 Nahill et al. Mar 2014 A1
20140071840 Venancio Mar 2014 A1
20140074746 Wang Mar 2014 A1
20140076974 Havens et al. Mar 2014 A1
20140078341 Havens et al. Mar 2014 A1
20140078342 Li et al. Mar 2014 A1
20140078345 Showering Mar 2014 A1
20140097249 Gomez et al. Apr 2014 A1
20140098792 Wang et al. Apr 2014 A1
20140100774 Showering Apr 2014 A1
20140103115 Meier et al. Apr 2014 A1
20140104413 McCloskey et al. Apr 2014 A1
20140104414 McCloskey et al. Apr 2014 A1
20140104416 Giordano et al. Apr 2014 A1
20140104451 Todeschini et al. Apr 2014 A1
20140106594 Skvoretz Apr 2014 A1
20140106725 Sauerwein Apr 2014 A1
20140108010 Maltseff et al. Apr 2014 A1
20140108402 Gomez et al. Apr 2014 A1
20140108682 Caballero Apr 2014 A1
20140110485 Toa et al. Apr 2014 A1
20140114530 Fitch et al. Apr 2014 A1
20140124577 Wang et al. May 2014 A1
20140124579 Ding May 2014 A1
20140125842 Winegar May 2014 A1
20140125853 Wang May 2014 A1
20140125999 Longacre et al. May 2014 A1
20140129378 Richardson May 2014 A1
20140131438 Kearney May 2014 A1
20140131441 Nahill et al. May 2014 A1
20140131443 Smith May 2014 A1
20140131444 Wang May 2014 A1
20140131445 Ding et al. May 2014 A1
20140131448 Xian et al. May 2014 A1
20140133379 Wang et al. May 2014 A1
20140136208 Maltseff et al. May 2014 A1
20140140585 Wang May 2014 A1
20140151453 Meier et al. Jun 2014 A1
20140152882 Samek et al. Jun 2014 A1
20140158770 Sevier et al. Jun 2014 A1
20140159869 Zumsteg et al. Jun 2014 A1
20140166755 Liu et al. Jun 2014 A1
20140166757 Smith Jun 2014 A1
20140166759 Liu et al. Jun 2014 A1
20140168787 Wang et al. Jun 2014 A1
20140175165 Havens et al. Jun 2014 A1
20140175172 Jovanovski et al. Jun 2014 A1
20140191644 Chaney Jul 2014 A1
20140191913 Ge et al. Jul 2014 A1
20140197238 Liu et al. Jul 2014 A1
20140197239 Havens et al. Jul 2014 A1
20140197304 Feng et al. Jul 2014 A1
20140204268 Grunow et al. Jul 2014 A1
20140214631 Hansen Jul 2014 A1
20140217166 Berthiaume et al. Aug 2014 A1
20140217180 Liu Aug 2014 A1
20140231500 Ehrhart et al. Aug 2014 A1
20140232930 Anderson Aug 2014 A1
20140247315 Marty et al. Sep 2014 A1
20140263493 Amurgis et al. Sep 2014 A1
20140263645 Smith et al. Sep 2014 A1
20140267609 Laffargue Sep 2014 A1
20140270196 Braho et al. Sep 2014 A1
20140270229 Braho Sep 2014 A1
20140278387 DiGregorio Sep 2014 A1
20140278391 Braho et al. Sep 2014 A1
20140282210 Bianconi Sep 2014 A1
20140284384 Lu et al. Sep 2014 A1
20140288933 Braho et al. Sep 2014 A1
20140312121 Lu et al. Sep 2014 A1
20140297058 Barker et al. Oct 2014 A1
20140299665 Barber et al. Oct 2014 A1
20140319220 Coyle Oct 2014 A1
20140319221 Oberpriller et al. Oct 2014 A1
20140326787 Barten Nov 2014 A1
20140332590 Wang et al. Nov 2014 A1
20140344943 Todeschini et al. Nov 2014 A1
20140346233 Liu et al. Nov 2014 A1
20140351317 Smith et al. Nov 2014 A1
20140353373 Van Horn et al. Dec 2014 A1
20140361073 Qu et al. Dec 2014 A1
20140361082 Xian et al. Dec 2014 A1
20140362184 Jovanovski et al. Dec 2014 A1
20140363015 Braho et al. Dec 2014 A1
20140369511 Sheerin et al. Dec 2014 A1
20140374483 Lu Dec 2014 A1
20140374485 Xian et al. Dec 2014 A1
20150001301 Ouyang Jan 2015 A1
20150001304 Todeschini Jan 2015 A1
20150003673 Fletcher Jan 2015 A1
20150009338 Laffargue et al. Jan 2015 A1
20150009610 London et al. Jan 2015 A1
20150014416 Kotlarsky et al. Jan 2015 A1
20150021397 Rueblinger et al. Jan 2015 A1
20150028102 Ren et al. Jan 2015 A1
20150028103 Jiang Jan 2015 A1
20150028104 Ma et al. Jan 2015 A1
20150029002 Yeakley et al. Jan 2015 A1
20150032709 Maloy et al. Jan 2015 A1
20150039309 Braho et al. Feb 2015 A1
20150039878 Barten Feb 2015 A1
20150040378 Saber et al. Feb 2015 A1
20150048168 Fritz et al. Feb 2015 A1
20150049347 Laffargue et al. Feb 2015 A1
20150051992 Smith Feb 2015 A1
20150053766 Havens et al. Feb 2015 A1
20150053768 Wang et al. Feb 2015 A1
20150053769 Thuries et al. Feb 2015 A1
20150060544 Feng et al. Mar 2015 A1
20150062366 Liu et al. Mar 2015 A1
20150063215 Wang Mar 2015 A1
20150063676 Lloyd et al. Mar 2015 A1
20150069130 Gannon Mar 2015 A1
20150071819 Todeschini Mar 2015 A1
20150083800 Li et al. Mar 2015 A1
20150086114 Todeschini Mar 2015 A1
20150088522 Hendrickson et al. Mar 2015 A1
20150096872 Woodburn Apr 2015 A1
20150099557 Pettinelli et al. Apr 2015 A1
20150100196 Hollifield Apr 2015 A1
20150102109 Huck Apr 2015 A1
20150115035 Meier et al. Apr 2015 A1
20150127791 Kosecki et al. May 2015 A1
20150128116 Chen et al. May 2015 A1
20150129659 Feng et al. May 2015 A1
20150133047 Smith et al. May 2015 A1
20150134470 Hejl et al. May 2015 A1
20150136851 Harding et al. May 2015 A1
20150136854 Lu et al. May 2015 A1
20150142492 Kumar May 2015 A1
20150144692 Hejl May 2015 A1
20150144698 Teng et al. May 2015 A1
20150144701 Xian et al. May 2015 A1
20150149946 Benos et al. May 2015 A1
20150161429 Xian Jun 2015 A1
20150169925 Chen et al. Jun 2015 A1
20150169929 Williams et al. Jun 2015 A1
20150178523 Gelay et al. Jun 2015 A1
20150178534 Jovanovski et al. Jun 2015 A1
20150178535 Bremer et al. Jun 2015 A1
20150178536 Hennick et al. Jun 2015 A1
20150178537 El Akel et al. Jun 2015 A1
20150181093 Zhu et al. Jun 2015 A1
20150181109 Gillet et al. Jun 2015 A1
20150186703 Chen et al. Jul 2015 A1
20150193644 Kearney et al. Jul 2015 A1
20150199957 Funyak et al. Jul 2015 A1
20150210199 Payne Jul 2015 A1
20150220753 Zhu et al. Aug 2015 A1
20150236984 Sevier Aug 2015 A1
20150254485 Feng et al. Sep 2015 A1
20150261643 Caballero et al. Sep 2015 A1
20150310243 Ackley Oct 2015 A1
20150310389 Crimm et al. Oct 2015 A1
20150312780 Wang et al. Oct 2015 A1
20150324623 Powilleit Nov 2015 A1
20150327012 Bian et al. Nov 2015 A1
20150373847 DATE NAME Dec 2015 A1
20160014251 Hejl Jan 2016 A1
20160040982 Li et al. Feb 2016 A1
20160042241 Todeschini Feb 2016 A1
20160055552 Chai et al. Feb 2016 A1
20160057230 Todeschini et al. Feb 2016 A1
20160062473 Bouchat et al. Mar 2016 A1
20160092805 Geisler et al. Mar 2016 A1
20160101936 Chamberlin Apr 2016 A1
20160102975 McCloskey et al. Apr 2016 A1
20160104019 Todeschini et al. Apr 2016 A1
20160104274 Jovanovski et al. Apr 2016 A1
20160109219 Ackley et al. Apr 2016 A1
20160109220 Laffargue Apr 2016 A1
20160109224 Thuries et al. Apr 2016 A1
20160112631 Ackley et al. Apr 2016 A1
20160112643 Laffargue et al. Apr 2016 A1
20160117627 Raj et al. Apr 2016 A1
20160124516 Schoon et al. May 2016 A1
20160125217 Todeschini May 2016 A1
20160125342 Miller et al. May 2016 A1
20160133253 Braho et al. May 2016 A1
20160171597 Todeschini Jun 2016 A1
20160171666 McCloskey Jun 2016 A1
20160171720 Todeschini Jun 2016 A1
20160171775 Todeschini et al. Jun 2016 A1
20160171777 Todeschini et al. Jun 2016 A1
20160174674 Oberpriller et al. Jun 2016 A1
20160178479 Goldsmith Jun 2016 A1
20160178685 Young et al. Jun 2016 A1
20160178707 Young et al. Jun 2016 A1
20160179132 Harr et al. Jun 2016 A1
20160179143 Bidwell et al. Jun 2016 A1
20160179368 Roeder Jun 2016 A1
20160179378 Kent et al. Jun 2016 A1
20160180130 Bremer Jun 2016 A1
20160180133 Oberpriller et al. Jun 2016 A1
20160180136 Meier et al. Jun 2016 A1
20160180594 Todeschini Jun 2016 A1
20160180663 McMahan et al. Jun 2016 A1
20160180678 Ackley et al. Jun 2016 A1
20160180713 Bernhardt et al. Jun 2016 A1
20160185136 Ng et al. Jun 2016 A1
20160185291 Chamberlin Jun 2016 A1
20160186926 Oberpriller et al. Jun 2016 A1
20160188861 Todeschini Jun 2016 A1
20160188939 Sailors et al. Jun 2016 A1
20160188940 Lu et al. Jun 2016 A1
20160188941 Todeschini et al. Jun 2016 A1
20160188942 Good et al. Jun 2016 A1
20160188943 Linwood Jun 2016 A1
20160188944 Witz et al. Jun 2016 A1
20160189076 Mellott et al. Jun 2016 A1
20160189087 Morton et al. Jun 2016 A1
20160189088 Pecorari et al. Jun 2016 A1
20160189092 George et al. Jun 2016 A1
20160189284 Mellott et al. Jun 2016 A1
20160189288 Todeschini Jun 2016 A1
20160189366 Chamberlin et al. Jun 2016 A1
20160189443 Smith Jun 2016 A1
20160189447 Valenzuela Jun 2016 A1
20160189489 Au et al. Jun 2016 A1
20160191684 DiPiazza et al. Jun 2016 A1
20160192051 DiPiazza et al. Jun 2016 A1
20160125873 Braho et al. Jul 2016 A1
20160202951 Pike et al. Jul 2016 A1
20160202958 Zabel et al. Jul 2016 A1
20160202959 Doubleday et al. Jul 2016 A1
20160203021 Pike et al. Jul 2016 A1
20160203429 Mellott et al. Jul 2016 A1
20160203797 Pike et al. Jul 2016 A1
20160203820 Zabel et al. Jul 2016 A1
20160204623 Haggerty et al. Jul 2016 A1
20160204636 Allen et al. Jul 2016 A1
20160204638 Miraglia et al. Jul 2016 A1
20160316190 McCloskey et al. Jul 2016 A1
20160227912 Oberpriller et al. Aug 2016 A1
20160232891 Pecorari Aug 2016 A1
20160292477 Bidwell Oct 2016 A1
20160294779 Yeakley et al. Oct 2016 A1
20160306769 Kohtz et al. Oct 2016 A1
20160314276 Sewell et al. Oct 2016 A1
20160314294 Kubler et al. Oct 2016 A1
20160323310 Todeschini et al. Nov 2016 A1
20160325677 Fitch et al. Nov 2016 A1
20160327614 Young et al. Nov 2016 A1
20160327930 Charpentier et al. Nov 2016 A1
20160328762 Pape Nov 2016 A1
20160330218 Hussey et al. Nov 2016 A1
20160343163 Venkatesha et al. Nov 2016 A1
20160343176 Ackley Nov 2016 A1
20160364914 Todeschini Dec 2016 A1
20160370220 Ackley et al. Dec 2016 A1
20160372282 Bandringa Dec 2016 A1
20160373847 Vargo et al. Dec 2016 A1
20160377414 Thuries et al. Dec 2016 A1
20160377417 Jovanovski et al. Dec 2016 A1
20170010141 Ackley Jan 2017 A1
20170010328 Mullen et al. Jan 2017 A1
20170010780 Waldron, Jr. et al. Jan 2017 A1
20170016714 Laffargue et al. Jan 2017 A1
20170018094 Todeschini Jan 2017 A1
20170046603 Lee et al. Feb 2017 A1
20170047864 Stang et al. Feb 2017 A1
20170053146 Liu et al. Feb 2017 A1
20170053147 Germaine et al. Feb 2017 A1
20170053647 Nichols et al. Feb 2017 A1
20170055606 Xu et al. Mar 2017 A1
20170060316 Larson Mar 2017 A1
20170061961 Nichols et al. Mar 2017 A1
20170064634 Van Horn et al. Mar 2017 A1
20170083730 Feng et al. Mar 2017 A1
20170091502 Furlong et al. Mar 2017 A1
20170091706 Lloyd et al. Mar 2017 A1
20170091741 Todeschini Mar 2017 A1
20170091904 Ventress Mar 2017 A1
20170092908 Chaney Mar 2017 A1
20170094238 Germaine et al. Mar 2017 A1
20170098947 Wolski Apr 2017 A1
20170100949 Celinder et al. Apr 2017 A1
20170108838 Todeschini et al. Apr 2017 A1
20170108895 Chamberlin et al. Apr 2017 A1
20170118355 Wong et al. Apr 2017 A1
20170123598 Phan et al. May 2017 A1
20170124369 Rueblinger et al. May 2017 A1
20170124396 Todeschini et al. May 2017 A1
20170124687 McCloskey et al. May 2017 A1
20170126873 McGary et al. May 2017 A1
20170126904 d'Armancourt et al. May 2017 A1
20170139012 Smith May 2017 A1
20170140329 Bernhardt et al. May 2017 A1
20170140731 Smith May 2017 A1
20170147847 Berggren et al. May 2017 A1
20170150124 Thuries May 2017 A1
20170169198 Nichols Jun 2017 A1
20170171035 Lu et al. Jun 2017 A1
20170171703 Maheswaranathan Jun 2017 A1
20170171803 Maheswaranathan Jun 2017 A1
20170180359 Wolski et al. Jun 2017 A1
20170180577 Nguon et al. Jun 2017 A1
20170181299 Shi et al. Jun 2017 A1
20170190192 Delario et al. Jul 2017 A1
20170193432 Bernhardt Jul 2017 A1
20170193461 Jonas et al. Jul 2017 A1
20170193727 Van Horn et al. Jul 2017 A1
20170200108 Au et al. Jul 2017 A1
20170200275 McCloskey et al. Jul 2017 A1
Foreign Referenced Citations (4)
Number Date Country
WO 2013163789 Nov 2013 WO
WO 2013173985 Nov 2013 WO
WO 2014019130 Feb 2014 WO
WO 2014110495 Jul 2014 WO
Non-Patent Literature Citations (9)
Entry
Harish et al., "Secured QR-Code Based COD Payment Through Mobile Bill Presentment System Replacing the POS Machine With an Electronic Device", International Journal of Advance Research in Science and Engineering, Vol. 5, Issue 2, Feb. 2016, www.ijarse.com.
Notice of Allowance for U.S. Appl. No. 15/688,133 dated Feb. 1, 2018.
U.S. Appl. No. 13/367,978 for a Laser Scanning Module Employing an Elastomeric U-Hinge Based Laser Scanning Assembly, filed Feb. 7, 2012 (Feng et al.).
U.S. Appl. No. 14/231,898 for Hand-Mounted Indicia-Reading Device with Finger Motion Triggering filed Apr. 1, 2014 (Van Horn et al.).
U.S. Appl. No. 14/277,337 for Multipurpose Optical Reader, filed May 14, 2014 (Jovanovski et al.).
U.S. Appl. No. 14/283,282 for Terminal Having Illumination and Focus Control filed May 21, 2014 (Liu et al.).
U.S. Appl. No. 14/446,391 for Multifunction Point of Sale Apparatus With Optical Signature Capture filed Jul. 30, 2014 (Good et al.).
U.S. Appl. No. 14/676,109 for Indicia Reader filed Apr. 1, 2015 (Huck).
U.S. Appl. No. 29/526,918 for Charging Base filed May 14, 2015 (Fitch et al.).
Related Publications (1)
Number Date Country
20180357632 A1 Dec 2018 US
Continuations (1)
Number Date Country
Parent 15688133 Aug 2017 US
Child 15968377 US