Controlling access to output device between two processors

Information

  • Patent Grant
  • Patent Number
    11,983,688
  • Date Filed
    Tuesday, February 8, 2022
  • Date Issued
    Tuesday, May 14, 2024
Abstract
A point of sale (POS) device includes an output device such as a speaker, a display screen, or a network interface. The POS device also includes a secure enclosure housing a secure processor and tamper detection circuitry for detecting attempts to tamper with the secure enclosure. Use of the output device is shared between the secure processor and a main processor via a switch that is controlled by the secure processor. The secure processor can switch control of the output device from the main processor to itself and can output an output dataset via the output device in a number of scenarios. These scenarios include the secure processor detecting an attempt to tamper with the secure enclosure, the secure processor recognizing that the main processor is behaving suspiciously, or the secure processor wanting to output sensitive information. The output dataset may include visual data, audio data, or network data.
Description
BACKGROUND

Payment object reading devices are devices that read information from payment objects, such as credit cards. Payment object reading devices typically include circuitry that reads, stores, or conveys sensitive information such as a customer's credit card number or personal identification number (“PIN”). If such circuitry of the payment object reader is left unprotected, a malicious party could potentially retrieve a customer's sensitive information by accessing the circuitry of the payment object reader that reads, stores, or conveys the sensitive information.


A secure enclosure refers to an enclosure or housing that includes tamper detection circuitry integrated into the enclosure or housing itself. Circuitry that is within the secure enclosure is protected or secured, while circuitry that is outside of the secure enclosure is generally unprotected and unsecured. A processor can be within a secure enclosure to protect or secure the processor. Tamper detection circuitry can interface with such a secured processor to help the secured processor identify an attempt by a malicious party to tamper with the secure enclosure.


An output device such as a speaker or a display can be controlled by a processor to output audio or to display visual media, respectively.


There is a need in the art for sharing of output devices between a secured processor and an unsecured processor, for example in a payment object reading device.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1A is a block diagram illustrating a main processor and a secure processor sharing an output device via a switch, where output device circuitry is within a secure enclosure.



FIG. 1B is a block diagram illustrating a main processor and a secure processor sharing an output device via a switch, where output device circuitry is outside of a secure enclosure.



FIG. 1C is a block diagram illustrating a main processor and a secure processor sharing an output device via a switch, where the output device is inside of a secure enclosure.



FIG. 2A is a flow diagram illustrating switching from a first state in which a main processor controls an output device to a second state in which a secure processor controls an output device in response to detection of tampering or a compromised main processor.



FIG. 2B is a flow diagram illustrating switching from a first state in which a main processor controls an output device to a second state in which a secure processor controls an output device in response to receipt of sensitive information at the secure processor.



FIG. 3A is a block diagram illustrating a main processor and a secure processor sharing a speaker output device via an H-bridge.



FIG. 3B is a block diagram illustrating a main processor and a secure processor sharing a headset output device via an H-bridge.



FIG. 4A is a block diagram of a payment object reader device with a display screen, an audio output, a main processor, and a secure processor.



FIG. 4B is a block diagram of the payment object reader device of FIG. 4A with switches added permitting sharing of the display screen and audio output between the main processor and the secure processor, where the switches are within the secure enclosure.



FIG. 4C is a block diagram of the payment object reader device of FIG. 4A with switches added permitting sharing of the display screen and audio output between the main processor and the secure processor, where the switches are outside of the secure enclosure.



FIG. 4D is a block diagram of the payment object reader device of FIG. 4A with H-bridges added permitting sharing of the display screen and audio output between the main processor and the secure processor, where the H-bridges are within the secure enclosure.



FIG. 4E is a block diagram of the payment object reader device of FIG. 4A with H-bridges added permitting sharing of the display screen and audio output between the main processor and the secure processor, where the H-bridges are outside of the secure enclosure.



FIG. 4F is a block diagram of the payment object reader device of FIG. 4B, where the payment object reader is separated from a host device.



FIG. 5A is a circuit diagram of an H-bridge with switches and two control inputs.



FIG. 5B is a circuit diagram of an H-bridge with transistors and four control inputs.



FIG. 6A is a block diagram of an H-bridge ASIC with two control inputs.



FIG. 6B is a block diagram of an H-bridge ASIC with four control inputs.



FIG. 7 is a block diagram of exemplary components that may be present on the circuit board.





DETAILED DESCRIPTION

A point of sale (POS) device includes an output device such as a speaker, a display screen, or a network interface. The POS device also includes a secure enclosure housing a secure processor and tamper detection circuitry for detecting attempts to tamper with the secure enclosure. Use of the output device is shared between the secure processor and a main processor via a switch that is controlled by the secure processor. The secure processor can switch control of the output device from the main processor to itself and can output an output dataset via the output device in a number of scenarios. These scenarios include the secure processor detecting an attempt to tamper with the secure enclosure, the secure processor recognizing that the main processor is behaving suspiciously, or the secure processor wanting to output sensitive information. The output dataset may include visual data, audio data, or network data.



FIG. 1A is a block diagram illustrating a main processor and a secure processor sharing an output device via a switch, where output device circuitry is within a secure enclosure. These components may be within a point of sale (POS) device and/or a payment/transaction object reader.


The terms “main processor 110” and “secure processor 120” as used herein should be understood to each include a set of one or more of any type of processor(s), controller(s), microcontroller(s), application specific integrated circuit(s) (ASIC), or combinations thereof. The “main processor 110” and “secure processor 120” may include any circuit board component illustrated or discussed with respect to the “processor(s)/controller(s) 710” or any of the rest of the circuit board components 700 illustrated or discussed with respect to FIG. 7.


The main processor 110 and/or secure processor 120 may run one or more operating systems such as Google® Android®, Apple® iOS®, Microsoft® Windows®, Google® Chrome OS®, Apple® MacOS®, Microsoft® Windows Phone OS®, a distribution of Linux®, or some combination thereof. The main processor 110 and/or secure processor 120 may include instructions for one or more applications, such as financial applications, point of sale (POS) applications, transit pass applications, or ticketing applications that may send data acquired from transaction object reader circuitry 770 as illustrated and discussed with respect to FIG. 7 to a financial server, credit card server, bank server, transit card server, or ticketing server for processing. These applications may also generate one or more user interfaces, such as a financial user interface, a POS user interface, a transit pass user interface, or a ticketing user interface. In one embodiment, the main processor 110 runs a Google® Android® OS and generates a main user interface via an Android® application that acquires data from the transaction object reader circuitry 770 and optionally via a user interface such as a keyboard, a number pad, touchscreen, or touch-sensitive surface. Such a user interface may be used to receive a user's personal identification number (PIN) code, a user's signature, a user's selection in response to a charity donation request, a user's selection in response to a question asking whether or not the user desires a receipt and/or whether the user would like a printed receipt or an electronic receipt sent to the user's electronic device, or identifying information about the user such as a name, physical address, e-mail address, or phone number.


The output device 170 of FIG. 1A may include, for example, a display screen, a touchscreen, a printer, a speaker, a headset interface such as an audio jack, a wireless local area network (WLAN) interface, an 802.11 Wi-Fi interface, an Ethernet interface, a local area network (LAN) interface, a cellular network interface, or some combination thereof. The output device circuitry 150 may include drivers, codecs, controllers, processors, combinations thereof, or any other circuitry used to connect to, control, and/or drive the output device 170.


The switch 130 allows either the secure processor 120 or the main processor 110 to electrically couple to, and thereby control, the output device circuitry 150 that electrically couples to, controls, and/or drives the output device 170. The switch 130 may be or include at least a transistor, such as a field effect transistor (FET).


The state of the switch 130 is controlled by the secure processor 120 via a control input/pin, such as a gate pin of a transistor. The switch 130 is located inside the secure enclosure 180 to prevent a main processor 110 that is misbehaving, malfunctioning, or compromised by a malicious party from inappropriately taking control of the output device circuitry 150. Alternate embodiments could have the switch 130 located outside of the secure enclosure 180 and/or controlled by the main processor 110, but this would likely be less secure in the event of a misbehaving main processor 110.


The switch 130 of FIG. 1A is illustrated in a first state in which the main processor 110 is electrically coupled through the switch 130 to the output device circuitry 150 and eventually to the output device 170. An arrow is illustrated in FIG. 1A showing how the switch would be toggled from the first state to a second state in which the secure processor 120 is electrically coupled through the switch 130 to the output device circuitry 150 and eventually to the output device 170. The switch 130 in FIG. 1B and FIG. 1C is shown in the second state, with a similar arrow showing how the switch would be toggled from the second state to the first state.


Instruction signals 140 coming from either processor through the switch 130 and to the output device circuitry 150 may be digital signals using control/communication protocols and/or standards such as Inter-Integrated Circuit (I2C), Universal Asynchronous Receiver/Transmitter (UART), Universal Synchronous/Asynchronous Receiver/Transmitter (USART), Serial Peripheral Interface (SPI), Universal Serial Bus (USB), or some combination thereof. Output signals 160 between the output device circuitry 150 and the output device 170 may include analog signals and may be scrambled, encrypted, filtered, use a proprietary or non-standard format, be otherwise difficult to interpret, or some combination thereof. Scrambling may involve sending different portions of information in an unusual order, for example. Generally, output signals 160 are more difficult to interpret than the instruction signals 140, and it is therefore safer for them to be conveyed outside of the secure enclosure 180 than for the instruction signals 140 to be conveyed outside of the secure enclosure 180. The output device circuitry 150 of FIG. 1A provides this added security because it is within the secure enclosure 180.


The output device circuitry 150 may include hardware and/or software elements that restrict and/or prevent information from flowing “backwards” through the switch 130 from the output device 170 to the secure processor 120, in case a malicious party attempts to access or alter the secure processor 120 in this manner. The hardware and/or software elements that restrict and/or prevent information from flowing “backwards” through the switch 130 may include diodes, such as isolation diodes.



FIG. 1B is a block diagram illustrating a main processor and a secure processor sharing an output device via a switch, where output device circuitry is outside of a secure enclosure.


The switch 130 of FIG. 1B and FIG. 1C is shown in the second state, in which the secure processor 120 is electrically coupled through the switch 130 to the output device circuitry 150 and eventually to the output device 170. An arrow is also illustrated at the switch 130 of FIG. 1B and FIG. 1C showing how the switch 130 would be toggled from the second state to the first state that is illustrated in FIG. 1A.


The output device circuitry 150 of FIG. 1B is outside of the secure enclosure 180, unlike in FIG. 1A where the output device circuitry 150 was inside the secure enclosure 180. In practice, a first subset of the output device circuitry 150 may be outside of the secure enclosure 180 as in FIG. 1B, while a second subset of the output device circuitry 150 may be within the secure enclosure 180 as in FIG. 1A.



FIG. 1C is a block diagram illustrating a main processor and a secure processor sharing an output device via a switch, where the output device is inside of a secure enclosure.


The architecture illustrated in FIG. 1C is the most secure of the three architectures illustrated in FIGS. 1A-1C because the output device 170 itself is located within the secure enclosure 180, and because the output signals 160 are conveyed solely within the secure enclosure 180 as well.


The architecture illustrated in FIG. 1C may also be somewhat restrictive and more difficult and expensive to build, however. Certain output devices 170, by their nature, are best suited to be at least partially located along an exterior surface of a device, such as display screens, touchscreens, speakers, headphone jacks, or printers—for such output devices 170, it may be difficult to place them within the secure enclosure 180. Some of these and other output devices 170 may be moved further into the interior of the devices in which they are located to make it more feasible to enclose them in the secure enclosure 180, but it is difficult to do so without compromising output quality by, for example, muffling sound from a speaker, muddying visuals from a display screen, or weakening signals from a wireless network transceiver. However, certain output devices 170 might not be affected much by this, such as wireless network transceivers that use a tamper mesh of the tamper detection circuitry as a transceiver antenna, or touchscreens whose touch-sensitive layer lines serve as a tamper mesh for the tamper detection circuitry.


The architecture illustrated in FIG. 1C, like the architecture illustrated in FIG. 1B, is illustrated with the switch 130 in the second state in which the secure processor 120 is electrically coupled through the switch 130 to the output device circuitry 150 and eventually to the output device 170.


An output-sharing circuit can be made with any combination of features/elements illustrated in and/or discussed with respect to FIGS. 3A-3B, FIGS. 4A-4E, FIGS. 5A-5B, FIGS. 6A-6B, or FIG. 7.



FIG. 2A is a flow diagram illustrating switching from a first state in which a main processor controls an output device to a second state in which a secure processor controls an output device in response to detection of tampering or a compromised main processor.


Step 205 includes transmitting a first output instruction from the main processor 110 to the output device through a switch 130 while the switch 130 is in a first state. This first output instruction may be, for example, an instruction from a transaction application running on the main processor 110 to output a transaction user interface to be used by the main processor to conduct a transaction, such as between a buyer and a merchant.


Such a transaction may involve receiving transaction information at the secure processor 120 from a transaction object such as a credit card via transaction object reader circuitry 770. The transaction may optionally include processing the transaction information at the secure processor 120 by encrypting it, password-protecting it, stripping out certain information, reformatting it, or converting it from one format to another, or some combination thereof, before sending the transaction information from the secure processor 120 to the main processor 110, after processing if applicable. Once the main processor 110 receives the processed transaction information from the secure processor 120, the main processor 110 then sends the processed transaction information from the main processor 110 to a transaction server such as a credit card server or bank server via a wired or wireless network interface, where the transaction server ensures that an appropriate transaction amount is transferred from a buyer account to a merchant account. The transaction user interface may incorporate a number of user interfaces that, for example, can assist the buyer or merchant in identifying/tabulating/totaling purchased items and amounts, instruct the buyer or merchant as to when to swipe or insert or tap or remove a transaction card or other transaction object, or ask the buyer about memberships associated with the merchant, charity donations to give along with the transaction, tip percentages/amounts associated with the transaction, whether the buyer wants a paper/plastic bag and if so what kind, and the like.
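
The transaction path described above can be sketched as follows. This is a purely illustrative Python model, not part of the claimed device: the function names, the toy XOR keystream, and the example key are all hypothetical stand-ins for whatever vetted cipher and interfaces a real implementation would use.

```python
import base64

def secure_processor_process(card_number: str, key: bytes) -> bytes:
    """Stand-in for the secure processor 120: encrypt transaction
    information before it is sent to the main processor 110.
    (A real device would use a vetted cipher; the XOR keystream
    here is purely illustrative.)"""
    data = card_number.encode()
    keystream = (key * (len(data) // len(key) + 1))[:len(data)]
    ciphertext = bytes(d ^ k for d, k in zip(data, keystream))
    return base64.b64encode(ciphertext)

def main_processor_forward(ciphertext: bytes) -> dict:
    """Stand-in for the main processor 110: it forwards the opaque
    blob toward a transaction server but cannot read the card number."""
    return {"payload": ciphertext.decode(), "destination": "transaction-server"}

key = b"\x13\x37\xc0\xde"            # hypothetical enclave-only key
blob = secure_processor_process("4111111111111111", key)
packet = main_processor_forward(blob)
assert "4111111111111111" not in packet["payload"]  # plaintext never leaves the enclave
```

The point of the sketch is the division of labor: the secure processor 120 is the only party that touches plaintext transaction information, while the main processor 110 handles only an opaque payload.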


Step 210 includes outputting a first output via the output device while the switch 130 is in the first state in which the main processor 110 is electrically coupled through the switch 130 to the output device 170 as discussed with respect to FIGS. 1A-1C. Various format conversions, such as digital to analog, may occur between step 205 and 210, as discussed with respect to the instruction signals 140, output device circuitry 150, and output signals 160 of FIGS. 1A-1C. The output in the example above would be the transaction user interface.


Step 210 may alternately be followed by step 205, step 215, or step 225.


Step 215 includes detecting, at a secure processor 120 and via tamper detection circuitry electrically coupled to the secure processor 120, an attempt to tamper with a secure enclosure.


The tamper detection circuitry can include a variety of different types of sensors and sensing methods. The tamper detection circuitry can use a “tamper mesh,” in which two long conductive tamper trace lines run in parallel and in a zig-zagging or boustrophedonic pattern that covers at least a majority of at least one surface of the secure enclosure 180. The two tamper trace lines are at different voltages, and the tamper detection circuitry includes voltage sensors that detect any changes in voltage along either or both lines. A malicious party attempting to drill into the secure enclosure 180 would likely break at least one of these conductive trace lines, connect the two lines together via the conductive metal of the drill itself, short two portions of the same line together via the conductive metal of the drill itself, or some combination thereof—all of which can be detectable as a voltage fluctuation/change over a predefined voltage change threshold as measured via the voltage sensors. The tamper detection circuitry can include inductive sensors that detect nearby objects that are metal or have conductive properties in response to an inductive sensor measurement exceeding a predefined threshold. The tamper detection circuitry can include capacitive sensors that detect touches to surface(s) of the secure enclosure 180 in response to a capacitive sensor measurement exceeding a predefined threshold, where the surface(s) of the secure enclosure 180 should remain internal and should not be touched. The detection of step 215 may include any of these sensors or any combination thereof.
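
The threshold comparison described above can be sketched as a few lines of Python. The baseline and threshold values are hypothetical placeholders; the patent specifies only that a "predefined voltage change threshold" exists, not its magnitude.

```python
BASELINE_MV = 3300   # nominal tamper-trace voltage (illustrative value)
THRESHOLD_MV = 150   # predefined voltage-change threshold (illustrative value)

def tamper_detected(samples_mv) -> bool:
    """Flag a tamper attempt when any voltage sample on a tamper trace
    deviates from the baseline by more than the predefined threshold,
    as when a drill breaks or shorts a trace line."""
    return any(abs(sample - BASELINE_MV) > THRESHOLD_MV for sample in samples_mv)

assert not tamper_detected([3290, 3310, 3305])  # normal jitter stays quiet
assert tamper_detected([3295, 1200, 3300])      # a shorted/broken trace trips it
```

Inductive and capacitive sensing would follow the same pattern, each with its own measurement stream and predefined threshold.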


Step 220 includes toggling the switch 130 from the first state to a second state via the secure processor 120 in response to detecting the attempt to tamper with the secure enclosure. In the second state, the secure processor 120 is electrically coupled through the switch 130 to the output device 170 as discussed with respect to FIGS. 1A-1C.


Step 225 includes detecting, at a secure processor 120, that the main processor 110 is likely to be compromised. This detection may be based on receipt of a warning by the secure processor 120 from security software and/or hardware. For example, such a warning may include an indication of unexpected or unsanctioned network activity from a firewall, an indication of virus detection from an antivirus program, an indication of adware detection from an anti-adware program, an indication of spyware detection from an anti-spyware program, an indication of malware detection from an anti-malware program, or some combination thereof. This detection may additionally or alternatively be based on detection of unusual behavior at the main processor 110, such as if the main processor 110 attempts to output a “spoof” of a user interface normally output through or in conjunction with the secure processor 120. Such a “spoof” user interface might, for example, attempt to simulate an “enter PIN” or “enter signature” user interface that would normally send the resulting PIN or signature from the user to the secure processor 120, but where the “spoof” version would instead collect the PIN or signature from the user at the main processor 110. A malicious party taking over the main processor 110 could then steal sensitive information, such as a PIN or signature, from a user. Therefore, detection of such a “spoof” interface by searching for similarities to legitimate security interfaces would be one way to detect that the main processor 110 is likely to be compromised at step 225.


Step 230 includes toggling the switch 130 from the first state to a second state via the secure processor 120 in response to detecting that the main processor 110 is likely to be compromised.


Step 235 includes transmitting a second output instruction from the secure processor 120 to the output device 170 through the switch 130 while the switch 130 is in the second state. Step 235 here could be preceded by step 220 and/or step 230 and can occur in response to either or both of those. The second output instruction here could be a warning user interface indicating that the POS device has likely been tampered with or compromised based on the detections of step 215 and/or step 225.


Step 240 includes outputting a second output via the output device while the switch 130 is in the second state. Various format conversions, such as digital to analog, may occur between step 235 and 240, as discussed with respect to the instruction signals 140, output device circuitry 150, and output signals 160 of FIGS. 1A-1C. The output in the example above would be the warning user interface.
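
The flow of FIG. 2A can be summarized with a toy state model. This sketch is illustrative only; the class and function names are invented for the example, and the real switch 130 is hardware, not software.

```python
class OutputSwitch:
    """Toy model of switch 130: the "first" state couples the main
    processor to the output device; the "second" state couples the
    secure processor.  Only the secure processor may toggle it."""
    def __init__(self):
        self.state = "first"

    def toggle_to_second(self, caller: str):
        if caller != "secure":  # the main processor cannot seize control
            raise PermissionError("only the secure processor controls switch 130")
        self.state = "second"

def route_output(switch: OutputSwitch, source: str, payload: str):
    """Deliver a payload only when the requesting processor currently
    owns the output device through the switch."""
    owner = "main" if switch.state == "first" else "secure"
    return payload if source == owner else None

sw = OutputSwitch()
assert route_output(sw, "main", "transaction UI") == "transaction UI"  # steps 205-210
sw.toggle_to_second("secure")   # e.g. tamper detected (steps 215-220)
assert route_output(sw, "main", "transaction UI") is None
assert route_output(sw, "secure", "tamper warning") == "tamper warning"  # steps 235-240
```

The key property the model captures is that the toggle is gated on the secure processor, so a compromised main processor cannot reclaim the output device on its own.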



FIG. 2B is a flow diagram illustrating switching from a first state in which a main processor controls an output device to a second state in which a secure processor controls an output device in response to receipt of sensitive information at the secure processor.


Step 250 includes transmitting a first output instruction from the main processor 110 to the output device through a switch 130 while the switch 130 is in a first state. This is similar to step 205 of FIG. 2A, and the same notes apply.


Step 255 includes outputting a first output via the output device while the switch 130 is in the first state. This is similar to step 210 of FIG. 2A, and the same notes apply.


Step 260 includes receiving and accessing sensitive information at the secure processor 120, wherein the main processor 110 lacks access to the sensitive information.


The sensitive information can include different types of data and can come from different sources/components. The sensitive information may include payment object information, such as a credit or debit card number, expiration date, or security code, or some combination thereof received from transaction object reader circuitry 770. The sensitive information may include personal user financial information, such as a bank account balance, a debt amount, an interest rate, an unpaid bill, a paid bill, or some combination thereof received from a wired and/or wireless network interface. The sensitive information may include a PIN code, signature, or user interface selection from a keypad, keyboard, mouse, touchscreen, or touch-sensitive surface of the POS device, or from a memory within the secure enclosure 180.


The sensitive information may be scrambled, encrypted, password-protected, or otherwise difficult to read. The term “access” as used with respect to step 260 thus may refer to read access, indicating that the main processor 110 cannot read and/or decrypt and/or unscramble the sensitive information even though it might be capable of retrieving an encrypted, scrambled, or password-protected copy of the sensitive information. On the other hand, the term “access” as used with respect to step 260 may simply refer to the ability (or lack thereof) of the main processor 110 to retrieve any copy of the sensitive information, encrypted/scrambled/protected or not.
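
The read-access versus retrieval-access distinction can be illustrated with a minimal sketch, under the assumption (not stated in the patent) that the protection is a simple scrambling keyed by a value held only inside the secure enclosure; the toy XOR and all names here are hypothetical.

```python
SECURE_KEY = 0x5A  # lives only inside the secure enclosure (illustrative)

def protect(pin: str) -> bytes:
    """Scramble a PIN inside the secure enclosure.  A toy XOR stands in
    for "scrambled, encrypted, password-protected, or otherwise
    difficult to read"."""
    return bytes(b ^ SECURE_KEY for b in pin.encode())

def main_processor_view(blob: bytes) -> str:
    """The main processor may hold the blob (retrieval access) but,
    lacking SECURE_KEY, cannot recover the PIN (no read access)."""
    return blob.hex()

blob = protect("1234")
assert "1234" not in main_processor_view(blob)  # retrievable, but unreadable
```

Under the stricter reading of "access" in step 260, even `main_processor_view` would be unavailable to the main processor 110.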


Step 265 includes toggling the switch 130 from the first state to a second state via the secure processor 120 in response to receiving the sensitive information.


Step 270 includes transmitting a second output instruction from the secure processor 120 to the output device through the switch 130 while the switch 130 is in the second state.


Step 275 includes outputting a second output via the output device while the switch 130 is in the second state, wherein the second output includes the sensitive information. Various format conversions, such as digital to analog, may occur between step 270 and 275, as discussed with respect to the instruction signals 140, output device circuitry 150, and output signals 160 of FIGS. 1A-1C.
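
Steps 250 through 275 can be replayed as a toy event loop. This is an illustrative sketch only; the event names and function are invented for the example and do not appear in the patent.

```python
def handle_sensitive(events):
    """Replay FIG. 2B: the switch starts in the first state (main
    processor owns the output); the arrival of sensitive information
    flips it to the second state so the secure processor outputs the
    data directly, bypassing the main processor."""
    state, outputs = "first", []
    for kind, payload in events:
        if kind == "main_output" and state == "first":
            outputs.append(("main", payload))      # steps 250-255
        elif kind == "sensitive":
            state = "second"                       # steps 260-265
            outputs.append(("secure", payload))    # steps 270-275
    return state, outputs

state, outputs = handle_sensitive([
    ("main_output", "transaction UI"),
    ("sensitive", "account balance"),
])
assert state == "second"
assert outputs == [("main", "transaction UI"), ("secure", "account balance")]
```

Note that once the switch is in the second state, further output attempts by the main processor are simply dropped in this model, mirroring the electrical decoupling in FIGS. 1A-1C.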



FIG. 3A is a block diagram illustrating a main processor and a secure processor sharing a speaker output device via an H-bridge.


The main processor 110 of FIG. 3A includes or is connected to an audio codec 310, which is illustrated in FIG. 3A as an audio codec ASIC separate from the main processor 110, but may be implemented at least partially in the main processor via software, hardware, or some combination thereof. The audio codec 310 includes speaker driver lines that drive a speaker output 330. The audio codec 310 includes headset driver lines that drive a headset output 340.


The secure processor 120 of FIG. 3A is housed within a secure enclosure 180 along with an H-bridge 320. The secure processor 120 controls and/or drives the H-bridge 320, for example via general purpose input/output (GPIO) pins/connectors of the secure processor 120. The H-bridge 320 is connected to the speaker driver lines and/or to the speaker output 330 via resistors R1 and R2 of FIG. 3A. Resistors R1 and R2 may be within the secure enclosure 180 as illustrated in FIG. 3A, or outside of it.
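
The GPIO control of the H-bridge 320 can be sketched as a truth table, in the spirit of the two-control-input H-bridge of FIG. 5A. The pin names and return labels are illustrative assumptions, not taken from the patent.

```python
def h_bridge_output(in_a: int, in_b: int) -> str:
    """Toy truth table mapping two GPIO control inputs from the secure
    processor to the differential drive across the load (e.g. the
    speaker).  Pin names are illustrative."""
    if in_a and not in_b:
        return "+V"   # current flows one way through the load
    if in_b and not in_a:
        return "-V"   # current flows the other way
    return "off"      # both high or both low: no drive across the load

assert h_bridge_output(1, 0) == "+V"
assert h_bridge_output(0, 1) == "-V"
assert h_bridge_output(0, 0) == "off"
```

Toggling the two inputs in alternation at an audio-rate frequency is one way such a bridge can drive a tone through the speaker output 330.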



FIG. 3B is a block diagram illustrating a main processor and a secure processor sharing a headset output device via an H-bridge.


The main processor 110 of FIG. 3B includes or is connected to the audio codec 310 as in FIG. 3A.


The secure processor 120 of FIG. 3B is housed within a secure enclosure 180. The H-bridge 320 of FIG. 3B is outside of the secure enclosure 180, unlike in FIG. 3A.


The H-bridge 320, the speaker output 330, and the headset output 340 may each be driven with power from the secure processor 120, the main processor 110, a battery within or outside of the secure enclosure 180, some other power source, or some combination thereof.


An output-sharing circuit can be made with any combination of features/elements illustrated in and/or discussed with respect to FIGS. 1A-1C, FIGS. 3A-3B, FIGS. 4A-4E, FIGS. 5A-5B, FIGS. 6A-6B, or FIG. 7.



FIG. 4A is a block diagram of a payment object reader device with a display screen, an audio output, a main processor, and a secure processor.


The payment object reader device 400 of FIG. 4A includes a main processor 110 that is electrically coupled to a memory 458, a battery 460, a charge and/or communication connector 452, a network interface 454, an audio output 464 via an audio codec 310 and/or amplifier 462, a display screen 466, touch-sensitive surface circuitry 492, and the secure processor 120 within the secure enclosure 180.


The touch-sensitive surface circuitry 492 may be a touch-sensitive layer that detects touches of a touchscreen whose display screen portion is the display screen 466. Having the touch-sensitive surface circuitry 492 in the secure enclosure 180 and connected to both the secure processor 120 and the main processor 110 allows sensitive user inputs, such as a personal identification number (PIN) code or a password or a personal telephone number, to be routed to the secure processor 120, while non-sensitive user inputs can be routed to the main processor 110.


The secure processor 120 is electrically coupled to a separate battery 474, the touch-sensitive surface circuitry 492, tamper detection circuitry 476 of the secure enclosure 180 itself, one or more magstripe read head(s) 480 for reading magnetic stripe transaction cards, and various transaction object reader circuitry 770. The transaction object reader circuitry 770 of FIG. 4A includes magstripe circuitry 478 corresponding to the magstripe read head(s) 480, an integrated circuit (IC) chip contact block 484 for reading IC chip cards, Europay/Mastercard/Visa (EMV) circuitry 482 corresponding to the IC chip contact block 484, a near field communication (NFC) antenna 488 for receiving information from a NFC-capable transaction object and/or sending information to the NFC-capable transaction object, and NFC receiver and/or transmitter circuitry 486 corresponding to the NFC antenna 488.


The transaction object reader circuitry 770 of FIG. 4A may read transaction information from any type of transaction object discussed with respect to the transaction object reader circuitry 770 of FIG. 7. The transaction information may be formatted, password-protected, and/or encrypted by the secure processor 120, by the various circuitries within the transaction object reader circuitry 770 of FIG. 4A, or some combination thereof. The transaction information may then be sent from the secure processor 120 to the main processor 110, which may then send it to an appropriate transaction server via the network interface 454, which may be a wired or wireless network interface and may include a Bluetooth® transceiver, a Bluetooth® Low Energy® transceiver, a 802.11 Wi-Fi transceiver, a wireless local area network (WLAN) transceiver, an Ethernet transceiver, a local area network (LAN) transceiver, or some combination thereof. The transaction server may be a bank server, a credit card or debit card issuer server, or any other server associated with the type of transaction object.


The audio output 464 may be a speaker output 330, a headset output 340, any other type of audio circuitry discussed with respect to the output device circuitry 750 of FIG. 7, or some combination thereof. The display screen 466 may be any type of display screen discussed with respect to the output device circuitry 750 of FIG. 7.


The charge and/or communication connector 452 may be a port, a plug, or a wireless transceiver, or some combination thereof, and can be used to connect the payment object reader device 400 to a power source. A connected portable computing device can provide power to the payment object reader device 400. The charge and/or communication port 452 can be, for example, a Universal Serial Bus (USB) port, an Apple® Lightning® port, or a TRS or TRRS audio/microphone jack port. Power coming into the payment object reader device 400 via the charge and/or communication port 452 can power the main processor 110, the secure processor 120, the audio output 464, the display screen 466, the transaction object reader circuitry 770 of FIG. 4A, the battery 460, the battery 474, or some combination thereof.



FIG. 4B is a block diagram of the payment object reader device of FIG. 4A with switches added permitting sharing of the display screen and audio output between the main processor and the secure processor, where the switches are within the secure enclosure.


In particular, switch 130A of FIG. 4B permits sharing of the audio output 464 between the main processor 110 and the secure processor 120 in a similar fashion to that illustrated in FIGS. 1A-1C. Switch 130B of FIG. 4B permits sharing of the display screen 466 between the main processor 110 and the secure processor 120, also similarly to FIGS. 1A-1C.
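The sharing arrangement, in which the secure processor 120 alone controls which processor drives a shared output (per the abstract), can be modeled as a small state machine. A minimal sketch, assuming the main processor owns the output by default; the class and method names are invented for illustration.

```python
from enum import Enum

class Owner(Enum):
    MAIN = "main"
    SECURE = "secure"

class OutputSwitch:
    """Models a switch like 130A/130B: only the secure side may toggle it."""
    def __init__(self):
        self.owner = Owner.MAIN  # main processor drives the output by default

    def toggle(self, requester: Owner, new_owner: Owner) -> None:
        if requester is not Owner.SECURE:
            raise PermissionError("only the secure processor controls the switch")
        self.owner = new_owner

    def drive(self, source: Owner, data):
        # Output passes through only for the currently connected processor.
        return data if source is self.owner else None
```

For example, on detecting tampering the secure processor would call `toggle(Owner.SECURE, Owner.SECURE)` and then drive its own alert to the display or speaker, while anything the main processor tries to drive is dropped.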



FIG. 4C is a block diagram of the payment object reader device of FIG. 4A with switches added permitting sharing of the display screen and audio output between the main processor and the secure processor, where the switches are outside of the secure enclosure.


As in FIG. 4B, switch 130A of FIG. 4C permits sharing of the audio output 464 between the main processor 110 and the secure processor 120, while switch 130B of FIG. 4C permits sharing of the display screen 466 between the main processor 110 and the secure processor 120. The switches 130A/B of FIG. 4C operate similarly to the switch 130 of FIGS. 1A-1C, though the switches 130A/B of FIG. 4C are outside of the secure enclosure 180, generally making their operation less secure than if they were within the secure enclosure 180 as in FIG. 4B.



FIG. 4D is a block diagram of the payment object reader device of FIG. 4A with H-bridges added permitting sharing of the display screen and audio output between the main processor and the secure processor, where the H-bridges are within the secure enclosure.


The H-bridge 320A of FIG. 4D permits sharing of the audio output 464 between the main processor 110 and the secure processor 120 in a similar manner to the H-Bridge 320 of FIG. 3A. The H-bridge 320B of FIG. 4D permits sharing of the display screen 466 between the main processor 110 and the secure processor 120 in a manner that is also similar to the H-Bridge 320 of FIG. 3A.


Resistors R1 and R2 positioned as in FIG. 3A are not illustrated in FIG. 4D but can be present between each H-bridge 320 and the corresponding audio output 464 or display screen 466.



FIG. 4E is a block diagram of the payment object reader device of FIG. 4A with H-bridges added permitting sharing of the display screen and audio output between the main processor and the secure processor, where the H-bridges are outside of the secure enclosure.


The H-bridge 320A of FIG. 4E permits sharing of the audio output 464 between the main processor 110 and the secure processor 120 in a similar manner to the H-Bridge 320 of FIG. 3B. The H-bridge 320B of FIG. 4E permits sharing of the display screen 466 between the main processor 110 and the secure processor 120 in a manner that is also similar to the H-Bridge 320 of FIG. 3B.


Resistors R1 and R2 positioned as in FIG. 3B are not illustrated in FIG. 4E but can be present between each H-bridge 320 and the corresponding audio output 464 or display screen 466, either within the secure enclosure 270 or outside of the secure enclosure 270.



FIG. 4F is a block diagram of the payment object reader device of FIG. 4B, where the payment object reader is separated from a host device.


The payment object reader device 400 of FIG. 4F includes mainly components associated with the secure processor 120 in FIGS. 4A-4E, while the separate host device 490 includes mainly components associated with the main processor 110 in FIGS. 4A-4E. Any combination of components within either the payment object reader device 400 or the host device 490 may be moved from one device to the other. For instance, the touch-sensitive surface circuitry 492 may be moved to the host device 490, or either or both switches 130 may be moved to the host device 490. In some cases, duplicate components, such as a charge connector 452 or even a main processor 110 outside of the secure enclosure 180, may exist in both the payment object reader device 400 and the host device 490 of FIG. 4F.


The connections between the payment object reader device 400 and the host device 490 of FIG. 4F may be wired or wireless, and may for example use a Universal Serial Bus (USB) connection, an Apple® Lightning® connection, a TRS or TRRS audio/microphone jack connection, a Bluetooth® connection, a Bluetooth® Low Energy® connection, an 802.11 Wi-Fi connection, a wireless local area network (WLAN) connection, an Ethernet connection, a local area network (LAN) connection, or some combination thereof.


It should be understood that the payment object reader devices 400 of FIGS. 4C-4E can also be “split” into a payment object reader device 400 and a separate host device 490 in a similar manner to the “split” between FIG. 4B and FIG. 4F.


A payment object reader device 400 can be made with any combination of features/elements illustrated in and/or discussed with respect to FIGS. 3A-3B, FIGS. 4A-4F, FIGS. 5A-5B, FIGS. 6A-6B, or FIG. 7.



FIG. 5A is a circuit diagram of an H-bridge with switches and two control inputs.


The H-bridge 320 of FIG. 5A and FIG. 5B includes two load connectors—580 and 585—that connect to a load 550. The load 550 may, for example, represent the speaker output 330 or headset output 340 of FIG. 3A or FIG. 3B. The load 550 may also represent any output device circuitry 150 or output device 170 discussed with respect to FIG. 5.


The H-bridge 320 of FIG. 5A includes four switches—a switch 510A, a switch 515A, a switch 520A, and a switch 525A. A first control input 540 controls the on/off state of switches 515A and 520A. A second control input 545 controls the on/off state of switches 510A and 525A.


When all of the switches are in their “off” state—that is, when they are all open and not conducting—the secure processor 120 does not drive output (e.g., audio) to the load 550. This state occurs when the first control input 540 and second control input 545 are in an “off” state.


When the first control input 540 is in an “on” state, the switches 515A and 520A are in an “on” state—that is, a closed state—and therefore conduct. Power is driven through the load 550 and through closed switches 515A and 520A.


When the second control input 545 is in an “on” state, the switches 510A and 525A are in an “on” state—that is, a closed state—and therefore conduct. Power is driven through the load 550 and through closed switches 510A and 525A.
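The three states described above can be summarized in a short truth-table function. This is an illustrative sketch: the "+"/"-" polarity labels are arbitrary (the actual direction depends on the wiring of FIG. 5A), and the note about asserting both control inputs at once (a shoot-through short of the supply) is standard H-bridge practice rather than something stated in the description.

```python
def h_bridge_load_state(ctrl_540: bool, ctrl_545: bool) -> str:
    """Load state of the two-input H-bridge of FIG. 5A.
    ctrl_540 closes switches 515A/520A; ctrl_545 closes 510A/525A."""
    if ctrl_540 and ctrl_545:
        return "invalid"  # all four switches closed would short the supply
    if ctrl_540:
        return "+"        # current driven through 515A -> load 550 -> 520A
    if ctrl_545:
        return "-"        # current driven the opposite way via 510A and 525A
    return "off"          # no conduction path; the load is not driven
```

Alternating between the "+" and "-" states reverses current through the load 550, which is how such a bridge can drive a speaker or similar output in both directions.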


The H-bridge 320 of FIGS. 5A and 5B may receive a supply voltage (V+ 530), a logic voltage (VDD 535), or both. The supply voltage (V+ 530) and/or logic voltage (VDD 535) may be supplied by the secure processor 120, the main processor 110, a battery within or outside of the secure enclosure 180, some other power source, or some combination thereof.



FIG. 5B is a circuit diagram of an H-bridge with transistors and four control inputs.


The four switches 510A, 515A, 520A, and 525A of FIG. 5A have been replaced in FIG. 5B with four transistors 510B, 515B, 520B, and 525B.


In FIG. 5B the GATE lines of each transistor go to separate control inputs. That is to say, control input 560 controls the gate line of transistor 510B, control input 565 controls the gate line of transistor 515B, control input 570 controls the gate line of transistor 520B, and control input 575 controls the gate line of transistor 525B.


The H-bridge 320 otherwise functions similarly to the one in FIG. 5A. Typically, control inputs 560 and 575 would be asserted together to drive the load 550 in one direction, or control inputs 565 and 570 would be asserted together to drive the load 550 in the other direction.
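With four independent gate inputs, the valid diagonal pairs and the invalid same-leg combinations can be sketched as below. The leg grouping is an inference from FIG. 5A's pairings (control input 540 closes 515/520 and control input 545 closes 510/525, so 510/520 form one half-bridge leg and 515/525 the other); that inference, and the shoot-through check, are assumptions for illustration.

```python
def four_input_bridge(g510: bool, g515: bool, g520: bool, g525: bool) -> str:
    """State of the FIG. 5B bridge, one gate input per transistor."""
    if (g510 and g520) or (g515 and g525):
        return "shoot-through"  # both transistors in one leg on: supply shorted
    if g515 and g520:
        return "drive-A"        # same path that control input 540 enabled
    if g510 and g525:
        return "drive-B"        # same path that control input 545 enabled
    return "idle"               # no complete path through the load
```

Giving each gate its own control input (as in FIG. 5B) allows these safety checks, and dead-time insertion between polarity reversals, to be handled by the controlling processor rather than fixed in the wiring.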


The H-bridge 320 can be made with any combination of features/elements illustrated in and/or discussed with respect to FIG. 5A and FIG. 5B.



FIG. 6A is a block diagram of an H-bridge ASIC with two control inputs.


The H-bridge 320 of FIG. 6A is illustrated as an ASIC block with various connections. The control inputs of FIG. 6A correspond to those of the H-bridge circuit diagram in FIG. 5A—that is, there are two control inputs 540 and 545.


The secure processor 120 is illustrated as connecting to the control inputs of FIGS. 6A and 6B via GPIO. The control inputs of FIGS. 6A and 6B may be controlled by the secure processor 120, the main processor 110, another processor, or some combination thereof.


As in FIGS. 5A and 5B, the supply voltage (V+ 530) and/or logic voltage (VDD 535) may be supplied by the secure processor 120, the main processor 110, a battery within or outside of the secure enclosure 180, some other power source, or some combination thereof. In FIGS. 6A and 6B, these are illustrated as supplied by a voltage VCC1 610 and VCC2 615, respectively. Various circuit components, such as filters, resistors, capacitors, inductors, or combinations thereof, are not illustrated but may be present between VCC1 610 and VDD 535, and between VCC2 615 and V+ 530. In some cases, VCC1 610 is the same voltage as VCC2 615, and may even be the same voltage source.



FIG. 6B is a block diagram of an H-bridge ASIC with four control inputs.


The control inputs of FIG. 6B correspond to those of the H-bridge circuit diagram in FIG. 5B—that is, there are four control inputs 560, 565, 570, and 575.


The H-bridge 320 illustrated in any of FIGS. 3A through 6B may also be used to amplify signals, such as audio signals, going to the load 550 from the secure processor 120 or from another source such as the main processor 110. The H-bridge 320 may be paired with filters, such as low-pass filters, high-pass filters, band-pass filters, or some combination thereof. Such filters may help provide cleaner audio output, since the secure processor 120 might omit an audio codec similar to the audio codec 310 of FIG. 3A and FIG. 3B. On the other hand, the secure processor 120 may use or be connected to an audio codec similar to the audio codec 310 of FIG. 3A and FIG. 3B, which may be coupled to the H-bridge instead of or in addition to the secure processor 120.
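The role of such a filter after the bridge can be illustrated with a first-order low-pass stage smoothing a square drive waveform. This is a generic signal-processing sketch, not circuitry from the disclosure; the smoothing coefficient is an arbitrary illustrative value.

```python
def low_pass(samples, alpha=0.1):
    """First-order IIR low-pass: y += alpha * (x - y) per sample.
    Approximates the smoothing an RC filter stage after the H-bridge
    provides for a switched (square) audio drive."""
    out, y = [], 0.0
    for x in samples:
        y += alpha * (x - y)
        out.append(y)
    return out

# A +1/-1 square drive from the bridge smooths toward a gentler waveform.
square = [1.0] * 10 + [-1.0] * 10
smoothed = low_pass(square)
```

The filtered output ramps toward each drive level instead of jumping, which is why filtering helps when the driving processor lacks a dedicated audio codec.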


In some cases, the H-bridge 320 may be replaced with or supplemented by a switch or transistor set up similarly to the switch illustrated in FIG. 5.


The H-bridge 320 can be made with any combination of features/elements illustrated in and/or discussed with respect to FIG. 5A, FIG. 5B, FIG. 6A, and FIG. 6B.



FIG. 7 illustrates exemplary circuit board components 700 that may be used to implement an embodiment of the present invention. The circuit board 100 described herein may include any combination of at least a subset of the circuit board components 700. In some embodiments, the circuit board 100 may actually include multiple circuit boards connected in a wired or wireless fashion, some of which may be at least partially enclosed by the security housing.


The circuit board components 700 of FIG. 7 may include one or more processors, controllers, or microcontrollers 710. These may in some cases aid in tamper detection, such as by performing at least some subset of the functions identified in FIG. 7. The circuit board components 700 of FIG. 7 may include one or more memory components 710 that may store, at least in part, instructions, executable code, or other data for execution or processing by the processor or controller 710. The memory components 710 may include, for example, cache memory, random access memory (RAM), read-only memory (ROM), or some other type of computer-readable storage medium.


The circuit board components 700 of FIG. 7 may further include one or more computer-readable storage medium(s) 730 for storing data, such as a hard drive, magnetic disk drive, optical disk drive, flash memory, magnetic tape-based memory, or another form of non-volatile storage. These may, for example, store credit card information, cryptographic keys, or other information, and may in some cases encrypt or decrypt such information with the aid of the processor or controller 710. The computer-readable storage medium(s) 730 may in some cases store, at least in part, instructions, executable code, or other data for execution or processing by the processor or controller 710.


The circuit board components 700 of FIG. 7 may include tamper detection circuitry 740, which may include any of the tamper detection circuitry 150 discussed herein, and may include the board connector piece holder(s) 255 and any components discussed in FIG. 7.


The circuit board components 700 of FIG. 7 may include output device circuitry 750, which may include, for example, communication circuitry for outputting data through wired or wireless means, display circuitry for displaying data via a display screen, audio circuitry for playing audio via headphones or a speaker, printer circuitry for printing data via a printer, or some combination thereof. The display screen may be a liquid crystal display (LCD), a plasma display, an organic light-emitting diode (OLED) display, an electronic ink display, a projector-based display, a holographic display, or some combination thereof. The printer may be inkjet, laserjet, thermal, or some combination thereof. In some cases, the output device circuitry 750 may allow for transmission of data over a headphone audio jack, a microphone jack, BLUETOOTH™ wireless signal transfer, radio-frequency identification (RFID), near-field communications (NFC), 802.11 Wi-Fi, cellular network data transfer, or some combination thereof.


The circuit board components 700 of FIG. 7 may include input device circuitry 760, which may include, for example, communication circuitry for receiving data through wired or wireless means, microphone circuitry for receiving audio data, user interface circuitry for receiving user interface inputs, or some combination thereof, and may include variable pressure detection. Touchscreens may be capacitive, resistive, acoustic, or some combination thereof. In some cases, the input device circuitry 760 may allow receipt of data over a headphone audio jack, a microphone jack, BLUETOOTH™ wireless signal transfer, radio-frequency identification (RFID), near-field communications (NFC), 802.11 Wi-Fi, cellular network data transfer, or some combination thereof. Input device circuitry 760 may receive data from an alpha-numeric keypad or keyboard, a pointing device, a mouse, a trackball, a trackpad, a touchscreen, a stylus, cursor direction keys, or some combination thereof. The input device circuitry 760 may also receive data from the transaction object reader circuitry 770.


The circuit board components 700 of FIG. 7 may include transaction object reader circuitry 770, which may include components capable of reading information from a transaction object, or may include circuitry supporting components capable of reading information from a transaction object, with the actual object reader components located off of the circuit board 100. The transaction object reader 770 may include at least one card reader. In this case, the transaction object may be a magnetic stripe onboard a transaction card, an integrated circuit (IC) chip onboard a transaction card, and/or a smartcard chip onboard a transaction card. The transaction card itself may be a credit card, a debit card, an automated teller machine (ATM) card, a gift card, a transit card, an identification card, a game token card, a ticket card, a bank card associated with a bank account, a credit union card associated with a credit union account, an online gaming card associated with an online gaming account, a healthcare card associated with a health savings account (HSA) or flexible spending account (FSA), a user account card associated with a user account of another type, or some combination thereof. The transaction object reader 770 may include at least one wireless signal reader for reading information wirelessly. In this case, the transaction object may be any of the transaction-card-related transaction objects discussed above (but read wirelessly), or may be a non-card object capable of wireless communication, such as a smartphone, a tablet, a wearable device, an active near field communication (NFC) and/or radio-frequency identification (RFID) tag, a passive NFC and/or RFID tag, or another mobile device capable of wireless communication via NFC, RFID, Bluetooth®, Bluetooth® Low Energy®, WLAN, Wi-Fi, or some combination thereof.


Transaction object reader circuitry 770 may include, for example, a magnetic read head or other type of magnetic stripe reader that is capable of reading information from a magnetic stripe of a transaction card. Transaction object reader circuitry 770 can also include an integrated circuit (IC) chip reader and/or smartcard chip reader for reading an IC chip and/or smartcard chip embedded in a transaction card. Such an IC chip/smartcard chip can follow the Europay-Mastercard-Visa (EMV) payment chip standard. The IC chip/smartcard chip reader can be contact-based, in that it can include one or more conductive prongs that contact a conductive metal contact pad of the IC chip/smartcard chip. The IC chip/smartcard chip can instead be contactless and use a contactless antenna. The contactless antenna can also double as a receiver for near-field-communication (NFC) signals, radio-frequency identification (RFID) signals, Bluetooth® wireless signals, wireless local area network (WLAN) signals, 802.xx Wi-Fi signals, or some combination thereof, which can be sent from a transaction card or from another type of transaction object as discussed above. In some cases, a transaction object may only send these wireless signals in response to receipt of a magnetic field or other wireless signals from the transaction object reader circuitry 770. For example, if the transaction object is a passive NFC/RFID tag or functions based on similar technology, it generates energy via induction coil(s) from the magnetic field or other wireless signals of the transaction object reader circuitry 770, and that energy is then used to transmit the wireless signals that are ultimately read by the transaction object reader circuitry 770.


The information read from the transaction object by the transaction object reader circuitry 770, regardless of the type of the transaction object, may include at least credit card information, debit card information, automated teller machine (ATM) information, gift card account information, transit account information, identification card information, game token card information, ticket information, bank account information, credit union account information, online gaming account information, HSA/FSA account information, health insurance account information, healthcare information, or some combination thereof. Certain terms discussed herein should be understood to refer to transaction objects, including but not limited to “payment object,” “transaction object,” “financial object,” “payment card,” “transaction card,” or “financial card.”


Peripheral circuitry 780 may include any type of circuitry permitting connection and use of computer support devices to add additional functionality to the circuit board 100. For example, peripheral circuitry 780 may support connection of a modem or a router. The components shown in FIG. 7 are depicted as being connected via a single bus 790. However, the components may be connected through one or more data transport means. For example, processor unit 710 and main memory 710 may be connected via a local microprocessor bus, and the storage medium 730, tamper detection circuitry 740, output device circuitry 750, input device circuitry 760, transaction object reader circuitry 770, and peripheral circuitry 780 may be connected via one or more input/output (I/O) buses.


While various flow diagrams have been described above, it should be understood that these show a particular order of operations performed by certain embodiments of the invention, and that such order is exemplary. Alternative embodiments can perform the operations in a different order, combine certain operations, or overlap certain operations illustrated in or described with respect to each flow diagram.


The foregoing detailed description of the technology has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the technology to the precise form disclosed. Many modifications and variations are possible in light of the above teaching. The described embodiments were chosen in order to best explain the principles of the technology and its practical application, and to enable others skilled in the art to utilize the technology in various embodiments and with various modifications as are suited to the particular use contemplated. It is intended that the scope of the technology be defined by the claims.

Claims
  • 1. A system for controlling access to an output device, the system comprising: a first processor that generates a first output and provides the first output to the output device through a switch while the switch is in a first state;a second processor that generates a second output and provides the second output to the output device through the switch while the switch is in a second state;the switch, wherein the switch is toggled between a plurality of states that includes the first state and the second state, wherein the switch electrically couples the output device to the first processor in the first state, wherein the switch electrically couples the output device to the second processor in the second state; andthe output device, wherein the output device outputs the first output when the switch is in the first state, wherein the output device outputs the second output when the switch is in the second state.
  • 2. The system of claim 1, wherein the first processor is configured to toggle the switch between the plurality of states in response to detecting that the second processor is compromised.
  • 3. The system of claim 1, wherein the first processor is configured to toggle the switch between the plurality of states in response to detecting, using tamper detection circuitry, an attempt to tamper with circuitry associated with the second processor.
  • 4. The system of claim 1, wherein the first processor is configured to toggle the switch between the plurality of states in response to detecting, using tamper detection circuitry, an attempt to tamper with circuitry associated with the switch.
  • 5. The system of claim 1, wherein the first processor is configured to toggle the switch between the plurality of states in response to detecting that the second processor is at least one of: malfunctioning, misbehaving, or acting unusually.
  • 6. The system of claim 1, wherein the first processor is configured to toggle the switch between the plurality of states in response to detecting an indication of detection of malware associated with the second processor, wherein the malware includes at least one of a virus, adware, or spyware.
  • 7. The system of claim 1, wherein the first processor is configured to toggle the switch between the plurality of states in response to detecting an attempt to spoof a user interface by the second processor.
  • 8. The system of claim 1, wherein the first processor is configured to toggle the switch between the plurality of states in response to detecting an indication of at least one of unexpected network activity or unsanctioned network activity.
  • 9. The system of claim 1, wherein the output device includes at least one of a display, a speaker, a headset, a printer, or a network interface.
  • 10. The system of claim 1, wherein the first output and the second output include, respectively, at least one of: user interfaces, sounds, printed documents, or network data transmissions.
  • 11. The system of claim 1, further comprising: a secure enclosure protected by tamper detection circuitry, wherein the secure enclosure includes at least the switch and the first processor.
  • 12. The system of claim 11, wherein the secure enclosure also includes at least a portion of the output device.
  • 13. The system of claim 1, further comprising: an H-bridge, wherein the H-bridge includes the switch.
  • 14. A method of controlling access to an output device between two processors, the method comprising: generating a first output at a first processor;generating a second output at a second processor;transmitting a first output instruction from the first processor to the output device through a switch while the switch is in a first state, the first output instruction instructing the output device to output the first output;outputting the first output using the output device while the switch is in the first state;toggling the switch from the first state to a second state;transmitting a second output instruction from the second processor to the output device through the switch while the switch is in the second state, the second output instruction instructing the output device to output the second output; andoutputting the second output using the output device while the switch is in the second state.
  • 15. The method of claim 14, wherein the toggling of the switch from the first state to the second state includes the second processor toggling the switch from the first state to the second state.
  • 16. The method of claim 14, further comprising: detecting that the first processor is compromised, wherein the toggling of the switch from the first state to the second state occurs in response to detecting that the first processor is compromised.
  • 17. The method of claim 14, further comprising: detecting, using tamper detection circuitry, an attempt to tamper with circuitry associated with at least one of the first processor or the switch, wherein toggling the switch from the first state to the second state occurs in response to detecting the attempt to tamper.
  • 18. The method of claim 14, further comprising: detecting malware associated with the first processor, wherein the toggling of the switch from the first state to the second state occurs in response to detecting the malware, wherein the malware includes at least one of a virus, adware, or spyware.
  • 19. The method of claim 14, further comprising: detecting an attempt to spoof a user interface by the first processor, wherein the toggling of the switch from the first state to the second state occurs in response to detecting the attempt to spoof the user interface by the first processor.
  • 20. The method of claim 14, further comprising: detecting an indication of unexpected or unsanctioned network activity, wherein the toggling of the switch from the first state to the second state occurs in response to detecting the indication of at least one of unexpected network activity or unsanctioned network activity.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. Non-Provisional patent application Ser. No. 15/836,713, entitled SHARING OUTPUT DEVICE BETWEEN UNSECURED PROCESSOR AND SECURED PROCESSOR, filed Dec. 8, 2017, which claims the benefit of U.S. Provisional Patent Application No. 62/578,657, entitled SHARING OUTPUT DEVICE BETWEEN UNSECURED PROCESSOR AND SECURED PROCESSOR, filed Oct. 30, 2017, the contents of which are incorporated herein by reference in their entireties.

2001-222595 Aug 2001 JP
2002-074507 Mar 2002 JP
2002-123771 Apr 2002 JP
2002-137506 May 2002 JP
2002-279320 Sep 2002 JP
2002-352166 Dec 2002 JP
2002-358285 Dec 2002 JP
2003-108777 Apr 2003 JP
2003-281453 Oct 2003 JP
2003-308438 Oct 2003 JP
2003-316558 Nov 2003 JP
2004-054651 Feb 2004 JP
2004-062733 Feb 2004 JP
2004-078553 Mar 2004 JP
2004-078662 Mar 2004 JP
2004-157604 Jun 2004 JP
2004-199405 Jul 2004 JP
2004-351899 Dec 2004 JP
2006-195589 Jul 2006 JP
2007-042103 Feb 2007 JP
2008-176390 Jul 2008 JP
4248820 Apr 2009 JP
2010-218196 Sep 2010 JP
2011-138424 Jul 2011 JP
2013-511787 Apr 2013 JP
2013-086448 May 2013 JP
2013-222444 Oct 2013 JP
2014-232479 Dec 2014 JP
2015-170356 Sep 2015 JP
2016-514442 May 2016 JP
2017-056698 Mar 2017 JP
2021-177405 Nov 2021 JP
10-1999-0066397 Aug 1999 KR
10-1999-0068618 Sep 1999 KR
200225019 Mar 2001 KR
10-2003-0005936 Jan 2003 KR
10-2003-0005984 Jan 2003 KR
10-2003-0012910 Feb 2003 KR
200333809 Nov 2003 KR
10-2004-0016548 Feb 2004 KR
100447431 Aug 2004 KR
200405877 Jan 2006 KR
100649151 Nov 2006 KR
10-2007-0107990 Nov 2007 KR
100842484 Jul 2008 KR
2284578 Sep 2006 RU
1998012674 Mar 1998 WO
2000011624 Mar 2000 WO
2000025277 May 2000 WO
2001086599 Nov 2001 WO
2002033669 Apr 2002 WO
2002043020 May 2002 WO
2002082388 Oct 2002 WO
2002084548 Oct 2002 WO
2003044710 May 2003 WO
2003079259 Sep 2003 WO
2004023366 Mar 2004 WO
2006131708 Dec 2006 WO
2014116235 Jul 2014 WO
2017053699 Mar 2017 WO
2018200730 Nov 2018 WO
2018200732 Nov 2018 WO
Non-Patent Literature Citations (57)
“MSP430x1xx Family User's Guide,” (including 2016 correction sheet at 2), Texas Instruments Inc., 2006.
Spegele, Joseph Brain., “A Framework for Evaluating Application of Smart Cards and Related Technology Within the Department of Defense,” Naval Postgraduate School, Jan. 1995.
Stephen A. Sherman et al., “Secure Network Access Using Multiple Applications of AT&T's Smart Card,” AT&T Technical Journal, Sep./Oct. 1994.
Non-Final Office Action dated Jul. 28, 2017, for U.S. Appl. No. 15/597,035, of Douthat, C., et al., filed May 16, 2017.
Final Office Action dated Jan. 12, 2018, for U.S. Appl. No. 15/597,035, of Douthat, C., et al., filed May 16, 2017.
Non-Final Office Action dated Apr. 16, 2018, for U.S. Appl. No. 15/620,642, of Maibach, M.H., et al., filed Jun. 12, 2017.
Notice of Allowance dated Jul. 18, 2018, for U.S. Appl. No. 15/597,035, of Douthat, C., et al., filed May 16, 2017.
Notice of Allowance dated Sep. 6, 2018, for U.S. Appl. No. 15/620,642, of Maibach, M.H., et al., filed Jun. 12, 2017.
Non-Final Office Action dated Sep. 7, 2018, for U.S. Appl. No. 15/582,174, of Douthat, C., et al., filed Apr. 28, 2017.
Final Office Action dated Mar. 11, 2019, for U.S. Appl. No. 15/582,174, of Douthat, C., et al., filed Apr. 28, 2017.
Non-Final Office Action dated Jul. 25, 2019, for U.S. Appl. No. 15/599,826, of Dorsey, J., et al., filed May 19, 2017.
Non-Final Office Action dated Oct. 1, 2019, for U.S. Appl. No. 15/582,166, of Douthat, C., et al., filed Apr. 28, 2017.
Non-Final Office Action dated Jun. 15, 2020, for U.S. Appl. No. 15/836,753, of Binder, J. C., et al., filed Dec. 8, 2017.
Notice of Allowance dated Oct. 24, 2019, for U.S. Appl. No. 15/582,174, of Douthat, C., et al., filed Apr. 28, 2017.
Notice of Allowance dated Jan. 27, 2020, for U.S. Appl. No. 15/599,826, of Dorsey, J., et al., filed May 19, 2017.
Non-Final Office Action dated Jan. 27, 2020, for U.S. Appl. No. 15/836,713, of Douthat, C., et al., filed Dec. 8, 2017.
Notice of Allowance dated Mar. 25, 2020, for U.S. Appl. No. 15/582,166, of Douthat, C., et al., filed Apr. 28, 2017.
Final Office Action dated Aug. 21, 2020, for U.S. Appl. No. 15/836,713, of Douthat, C., et al., filed Dec. 8, 2017.
Final Office Action dated Oct. 5, 2020, for U.S. Appl. No. 15/836,753, of Binder, J. C., et al., filed Dec. 8, 2017.
Advisory Action dated Nov. 4, 2020, for U.S. Appl. No. 15/836,753, of Binder, J. C., et al., filed Dec. 8, 2017.
Notice of Allowance dated Dec. 10, 2020, for U.S. Appl. No. 15/836,753, of Binder, J. C., et al., filed Dec. 8, 2017.
Notice of Allowance dated Jan. 12, 2021, for U.S. Appl. No. 15/836,753, of Binder, J. C., et al., filed Dec. 8, 2017.
Non-Final Office Action dated Mar. 23, 2021, for U.S. Appl. No. 15/836,713, of Douthat, C., et al., filed Dec. 8, 2017.
Non-Final Office Action dated May 12, 2021, for U.S. Appl. No. 16/923,671, of Douthat, C., et al., filed Jul. 8, 2020.
Final Office Action dated Nov. 10, 2021, for U.S. Appl. No. 16/923,671, of Douthat, C., et al., filed Jul. 8, 2020.
Notice of Allowance dated Sep. 27, 2021, for U.S. Appl. No. 15/836,713, of Douthat, C., et al., filed Dec. 8, 2017.
Advisory Action dated Jan. 24, 2022, for U.S. Appl. No. 16/923,671, of Douthat, C., et al., filed Jul. 8, 2020.
European Office Action for European Patent Application No. 18742856.0, dated Feb. 18, 2020.
Intention to Grant European Patent Application No. 18724079.1, dated May 20, 2020.
Office Action for European Patent Application No. 18724079.1, dated Sep. 29, 2020.
Examiner Requisition for Canadian Patent Application No. 3059245, dated Nov. 12, 2020.
English language translation of Notice of Reason for Refusal for Japanese Patent Application No. 2019-554368, dated Nov. 20, 2020.
International Search Report and Written Opinion for International Application No. PCT/US2018/029451, dated Sep. 17, 2018.
International Search Report and Written Opinion for International Application No. PCT/US2018/029449, dated Jul. 31, 2018.
Examiner Requisition for Canadian Patent Application No. 3059051, dated Jan. 4, 2021.
English language translation of Office Action received in Japanese Patent Application No. 2019-554332, dated Jan. 5, 2021.
Office Action for European Patent Application No. 20177533.5, dated May 25, 2021.
English language translation of Decision to Grant for Japanese Patent Application No. 2019-554368, dated Jun. 25, 2021.
English language translation of Decision to Grant received in Japanese Patent Application No. 2019-554332, dated Jul. 2, 2021.
Summons to oral proceedings for European Patent Application No. 20177533.5, dated Oct. 25, 2021.
Examiner Requisition for Canadian Patent Application No. 3059051, dated Oct. 20, 2021.
Summons to oral proceedings for European Patent Application No. 18724079.1, dated Feb. 7, 2022.
“Connection of Terminal Equipment to the Telephone Network,” FCC 47 CFR Part 68, Oct. 1, 1999 Edition, Retrieved from the URL: http://www.tscm.com/FCC47CFRpart68.pdf, on Sep. 24, 2019.
“EMBEDDED FINancial transactional IC card READer,” Retrieved from the URL: https://cordis.europa.eu/project/rcn/58338/factsheet/en.
Geethapriya Venkataramani and Srividya Gopalan., “Mobile phone based RFID architecture for secure electronic payments using RFID credit cards,” 2007 IEEE, (ARES'07).
“Guideline for the Use of Advanced Authentication Technology,” FIPS 190, Sep. 28, 1994.
“Identification cards—Recording technique—Part 4—Location of read-only magnetic tracks—Track 1 and 2,” ISO/IEC 7811-4:1995, International Organization for Standardization, Aug. 1995.
Jerome Svigals., “The Long Life and Imminent Death of the Mag-stripe Card,” IEEE Spectrum, vol. 49, Issue 61, Jun. 2012.
“Magensa's Decryption Services and MagTek's MagneSafe™ Bluetooth Readers Selected by eProcessing Network to Implement Secure Customer Card Data with Mobile Devices,” Retrieved from the URL: https://www.magnensa.net/aboutus/articles/eProcessing-rev1.pdf Apr. 14, 2008.
Martha E. Haykin et al., “Smart Card Technology: New Methods for Computer Access Control,” NIST Special Publication 500-157, Sep. 1988.
Examiner Requisition for Canadian Patent Application No. 3059245, dated Mar. 21, 2022.
Non-Final Office Action dated Mar. 17, 2022, for U.S. Appl. No. 16/923,671, of Douthat, C., et al., filed Jul. 8, 2020.
Fujioka et al., “Security of Sequential Multiple Encryption”, Progress in Cryptology—LATINCRYPT 2010, pp. 20-39 (Aug. 8, 2010) (Abstract Only).
English language translation of Notice of Reasons for Refusal for Japanese Patent Application No. 2021-114327 dated Aug. 5, 2022.
Examiner Requisition for Canadian Patent Application No. 3059245, dated Aug. 5, 2022.
English Language Translation of Decision to Grant for Japanese Patent Application No. 2021-114327, dated Oct. 3, 2022.
Notice of Allowance dated Sep. 29, 2022, for U.S. Appl. No. 16/923,671, of Douthat, C., et al., filed Jul. 8, 2020.
Related Publications (1)
Number Date Country
20220164782 A1 May 2022 US
Provisional Applications (1)
Number Date Country
62578657 Oct 2017 US
Continuations (1)
Number Date Country
Parent 15836713 Dec 2017 US
Child 17667032 US