Bidirectional audio communication in reader devices

Information

  • Patent Grant
  • Patent Number
    9,230,143
  • Date Filed
    Friday, December 19, 2014
  • Date Issued
    Tuesday, January 5, 2016
Abstract
Aspects of the subject disclosure provide a card reader for receiving payment card information at a mobile point-of-sale terminal. In some implementations, a reader of the subject technology can include a memory, a conditioning module and a 3.5 mm audio plug including an audio bus that is configured for insertion into a headphone port of a host device, such as a smart phone or tablet computer. Implementations of the subject technology also include a microprocessor configured to perform operations for receiving a training sequence for use in determining communication parameters associated with a mobile device, and in response to the training sequence, transmitting an acknowledgement signal to the mobile device, via the audio bus, to indicate that a communicative coupling with the mobile device has been successfully established.
Description
BACKGROUND

The ubiquity of headphone ports makes them an attractive option for use as communication channels for attachable hardware devices. For example, 3.5 mm audio ports are standard on many devices, and in particular mobile devices such as smart phones and tablet computers. Such ports can be used to provide communication between a host mobile device and a hardware attachment, such as an attachable card reader (e.g. “reader”) for reading information from the magnetic stripe of a payment card.





BRIEF DESCRIPTION OF THE DRAWINGS

Certain features of the subject technology are set forth in the appended claims. However, the accompanying drawings, which are included to provide further understanding, illustrate disclosed aspects and together with the description serve to explain the principles of the subject technology. In the drawings:



FIG. 1 illustrates a conceptual block diagram of hardware components used to facilitate bi-directional communication with a reader;



FIG. 2 illustrates a conceptual block diagram of the coupling between an audio bus and a conditioning module, according to certain aspects of the technology;



FIG. 3 illustrates an example process with which bi-directional reader communication can be implemented;



FIG. 4 illustrates an example reader, including a 3.5 mm audio plug, according to some embodiments; and



FIG. 5 depicts a conceptual environment in which a reader of the subject technology can be used to facilitate a financial transaction between a buyer and a merchant.





DETAILED DESCRIPTION

The detailed description set forth below is intended as a description of various configurations of the subject technology and is not intended to represent the only configurations in which the technology can be practiced. The appended drawings are incorporated herein and constitute a part of the detailed description, which includes specific details for the purpose of providing a more thorough understanding of the subject technology. However, it will be clear that the subject technology is not limited to the specific details set forth herein and may be practiced without these specific details. In some instances, structures and components are shown in block diagram form in order to avoid obscuring the concepts of the subject technology.


Due to limitations in transmitting data over audio ports, conventional readers are restricted to unidirectional communication and are only capable of sending signals to a host mobile device. Accordingly, when payment card information is read and transmitted by the reader, the reader has no way to confirm that the information has been properly received. In some conventional reader implementations, the reader can be configured to send multiple iterations of data to the host device (e.g., smart phone or tablet computer) in order to increase the likelihood that the data is received. However, it would be advantageous for the reader to receive signaling from the host device, for example, to provide acknowledgements of data receipt. Additionally, bi-directional communication over the audio port could provide several other advantages, including, but not limited to, enabling the reader to configure reader-to-host communication and to receive reader firmware updates.


Systems and methods in accordance with various aspects of the present disclosure overcome one or more of the above-referenced and other deficiencies in conventional reader-to-host communication. In particular, aspects of the subject technology provide solutions for enabling bi-directional communication over audio ports, such as standard 3.5 mm audio/headphone ports. As discussed in further detail below, bi-directional audio port communication can yield several advantages, including increased transmission bandwidth and session-based management of reader communications.


For host-to-reader communication, audio signaling is transmitted by the host device, such as a mobile phone or tablet computer, over an audio bus (such as the audio port or headphone jack). The audio signaling is then received by a reader on a left and/or right channel of the audio bus. Received signals are digitized, e.g., using an analog-to-digital converter (ADC), before being provided to the microcontroller or processor of the reader.


Through the ability to receive audio signals, and thus data, on one (or both) of the left/right audio channels, several host-to-reader communication features may be implemented. In some aspects, different kinds of data can be selected for transmission to the mobile host on separate audio channels. For instance, a timing or clock signal may be provided on one audio channel, such as the left channel, whereas data is provided on the other channel (e.g., the right channel). As such, both channels may be utilized for the timing and transmission of data.
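As a rough illustration of the clock-plus-data arrangement described above, the sketch below samples one channel on the rising edges of the other. The function names, the square-wave clock, and the ±1 signal levels are assumptions made for illustration, not details taken from the patent; real firmware would operate on ADC codes inside the reader's microcontroller.

```python
import numpy as np

def recover_bits(clock_ch, data_ch, threshold=0.0):
    """Sample the data channel on each rising edge of the clock channel."""
    high = clock_ch > threshold
    rising_edges = np.flatnonzero(~high[:-1] & high[1:]) + 1
    return [1 if data_ch[i] > threshold else 0 for i in rising_edges]

if __name__ == "__main__":
    samples_per_bit = 8
    bits = [1, 0, 1, 1, 0, 0, 1, 0]
    # Clock: one low/high cycle per bit; data: level held for the whole bit.
    clock = np.tile(np.r_[-np.ones(samples_per_bit // 2),
                          np.ones(samples_per_bit // 2)], len(bits))
    data = np.repeat([1.0 if b else -1.0 for b in bits], samples_per_bit)
    print(recover_bits(clock, data))  # -> [1, 0, 1, 1, 0, 0, 1, 0]
```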


In another aspect, the receipt of signaling on multiple audio channels can be used to facilitate the encoding of information. By way of example, data may be encoded based on differences in signaling that is provided on the left and right audio channels. As explained in further detail below, a converter (e.g., including a digital-to-analog converter and an analog-to-digital converter) may be used to facilitate the decoding of data based on signaling differences between the audio channels.
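A minimal sketch of one such difference-based scheme follows: each bit drives the left and right channels with opposite polarity, and the decoder looks only at the sign of the per-bit average of (left − right). The specific encoding, the ±1 levels, and the noise model are assumptions for illustration; the patent does not prescribe a particular differential scheme.

```python
import numpy as np

def encode_differential(bits, samples_per_bit=8, amplitude=1.0):
    """Drive the two speaker channels with opposite polarity for each bit."""
    left = np.repeat([amplitude if b else -amplitude for b in bits], samples_per_bit)
    return left, -left                       # right channel mirrors the left

def decode_differential(left, right, samples_per_bit=8):
    """Recover bits from the sign of the per-bit mean of (left - right)."""
    diff = (left - right).reshape(-1, samples_per_bit)
    return [1 if chunk.mean() > 0 else 0 for chunk in diff]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    bits = [1, 0, 0, 1, 1, 0]
    left, right = encode_differential(bits)
    left = left + rng.normal(0, 0.2, left.shape)    # add some channel noise
    right = right + rng.normal(0, 0.2, right.shape)
    print(decode_differential(left, right))         # -> [1, 0, 0, 1, 1, 0]
```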


In yet another aspect, audio signaling received from a mobile host device may include data for use by the card reader. For example, a music signal may be modulated by the mobile host to include payload data for the card reader. By mixing payload data in an audio signal (e.g., a music signal), the card reader can receive payload data simultaneously with the broadcast of music over a headphone port of the mobile device.
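The sketch below illustrates the idea of riding payload bits on top of an ordinary audio signal: a quiet carrier near the top of the audio band is on/off-keyed with the payload, and the receiver correlates each bit interval against that carrier while the underlying audio is left essentially intact. The sample rate, carrier frequency, bit rate, and keying scheme are illustrative assumptions; the patent does not specify how the payload is modulated onto the music.

```python
import numpy as np

FS = 44_100          # assumed host sample rate (Hz)
CARRIER = 18_000     # assumed carrier near the top of the audio band (Hz)
SPB = FS // 100      # samples per payload bit (assumed 100 bit/s)

def mix_payload(music, bits, level=0.05):
    """On/off-key a quiet carrier onto the music, one keying interval per bit."""
    n = len(bits) * SPB
    t = np.arange(n) / FS
    keying = np.repeat(np.asarray(bits, dtype=float), SPB)
    return music[:n] + level * keying * np.sin(2 * np.pi * CARRIER * t)

def recover_payload(signal, n_bits):
    """Correlate each bit interval against the carrier and threshold the result."""
    t = np.arange(len(signal)) / FS
    ref = np.sin(2 * np.pi * CARRIER * t)
    corr = (signal * ref)[: n_bits * SPB].reshape(n_bits, SPB).mean(axis=1)
    return [1 if c > corr.max() / 2 else 0 for c in corr]

if __name__ == "__main__":
    bits = [1, 0, 1, 1, 0, 1, 0, 0]
    t = np.arange(len(bits) * SPB) / FS
    music = 0.5 * np.sin(2 * np.pi * 440 * t)     # stand-in for the music signal
    print(recover_payload(mix_payload(music, bits), len(bits)))  # -> same bits
```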


For reader-to-host communication, digital information provided by the reader (e.g., a microcontroller of the reader) is converted into an analog signal, using a digital-to-analog converter (DAC), before transmission to a mobile device via an audio port. In certain aspects, Hi-Fi audio codecs can be used to represent digital information in an analog signal, or in what appears to be an analog (audio) signal to the signal processing of the host device or smart phone. Analog signaling transmitted by the reader is provided in the form of modulated current carried by a microphone bus, e.g., of a standard 3.5 mm audio port. Once received, analog signaling is decoded and digitized using software and/or hardware of the host device.
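For reader-to-host transmission, the digital data must therefore end up as an audio-band waveform on the microphone line. The sketch below shows one conventional way this could be done, rendering bytes as a phase-continuous two-tone (FSK) waveform; the tone frequencies, baud rate, and bit ordering are illustrative assumptions and are not taken from the patent, which leaves the modulation and codec choice open.

```python
import numpy as np

FS = 44_100                       # assumed sample rate seen by the host
BAUD = 1_225                      # assumed symbol rate (divides FS evenly)
F_MARK, F_SPACE = 2_200, 1_200    # illustrative tone pair, not from the patent

def fsk_modulate(payload: bytes) -> np.ndarray:
    """Render payload bits (LSB first) as a phase-continuous FSK waveform."""
    bits = [(byte >> i) & 1 for byte in payload for i in range(8)]
    spb = FS // BAUD              # samples per bit (36 with the values above)
    phase, pieces = 0.0, []
    for b in bits:
        f = F_MARK if b else F_SPACE
        pieces.append(np.sin(phase + 2 * np.pi * f * np.arange(spb) / FS))
        phase += 2 * np.pi * f * spb / FS      # carry phase into the next bit
    return np.concatenate(pieces)

# These samples would be written to the DAC that drives the microphone bus;
# software on the host would then demodulate them back into bytes.
waveform = fsk_modulate(b"ACK")
```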


In certain aspects, a conditioning module is used to offset received AC signaling in order to standardize signal properties (e.g., amplitude) and eliminate negative voltage components. Signal conditioning can also help to address variations caused by differences between different mobile devices, for example, due to differences in headphone volume settings. Conditioning ensures that voltage levels of the received signaling are adjusted to fall within a predetermined range before being provided to a microcontroller of the reader.
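Conceptually, the conditioning stage shifts the AC waveform so it sits inside the window the reader's ADC can sample and clamps whatever still falls outside. The short model below captures that behavior in software; the 3.3 V reference and the default choice of bias are assumptions for illustration (in the described hardware, the bias is set from the microcontroller via a DAC and a diode clamp).

```python
import numpy as np

def condition(samples, v_ref=3.3, bias=None):
    """Model of the conditioning stage: bias the AC waveform into the
    0..v_ref ADC window, then clamp anything that still falls outside."""
    if bias is None:
        bias = v_ref / 2 - np.mean(samples)   # center the waveform by default
    return np.clip(samples + bias, 0.0, v_ref)

# A headphone output swinging roughly +/-1 V becomes a 0.65..2.65 V signal.
audio = np.sin(np.linspace(0, 2 * np.pi, 200))
print(condition(audio).min().round(2), condition(audio).max().round(2))
```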


In certain aspects, connection of a reader to a host mobile device can trigger transmission of a predetermined training sequence from the mobile device to the reader. Through advance knowledge of the training sequence, the reader can use the training sequence to assess a quality of the received signal. This assessment can be used to facilitate calibration of the reader, e.g., by adjusting or tuning circuitry of the conditioning module. Once calibrated, the reader can be better configured to receive communication from the host mobile device.
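One simple way a reader could score a known training sequence is to fit the received samples against the stored profile and examine the resulting gain and residual error, as in the hedged sketch below. The least-squares fit, the profile waveform, and the noise level are illustrative; the patent does not define a specific quality metric.

```python
import numpy as np

def assess_training(received, expected):
    """Fit the received training sequence to its known profile.
    Returns the estimated gain (amplitude ratio) and a normalized
    residual error the reader could use to drive calibration."""
    gain = np.dot(received, expected) / np.dot(expected, expected)
    residual = received - gain * expected
    nrmse = np.sqrt(np.mean(residual ** 2)) / np.max(np.abs(expected))
    return gain, nrmse

expected = np.sin(np.linspace(0, 8 * np.pi, 400))          # stored profile
received = 0.4 * expected + np.random.default_rng(1).normal(0, 0.02, 400)
gain, err = assess_training(received, expected)
print(f"gain={gain:.2f} error={err:.3f}")  # low gain suggests raising host volume
```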



FIG. 1 illustrates a conceptual block diagram of hardware components of a reader configured for bi-directional communication, according to some aspects of the subject technology. Reader 100 includes microcontroller 110, memory 120, digital-to-analog converter (DAC) 130, analog-to-digital converter (ADC) 140, conditioning module 150, and read head 160.


As illustrated, microcontroller 110 is coupled to memory 120, DAC 130 and ADC 140. Additionally, microcontroller 110 is coupled to conditioning module 150, via speaker channel 115, as well as read head 160, via ADC 140. In turn, DAC 130 is coupled to conditioning module 150, via microphone channel 105.


It is understood that reader 100 can be implemented using various other hardware components and/or configurations, and is not limited to the architecture depicted in FIG. 1. For example, microcontroller 110 can be implemented using a general-purpose processor, a microcontroller, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a programmable logic device (PLD), a controller, a state machine, gated logic, discrete hardware components, or a combination of the foregoing.


Additionally, various types of memory can be utilized in place of, or in addition to, memory 120. Similarly, one or more sequences of instructions may be stored as firmware on a ROM within microcontroller 110. One or more sequences of instructions can also be software stored in and read from another storage medium, such as a flash memory array, or received from a host device (e.g., a mobile device such as a smart phone or tablet computing device) via a host interface. ROM, storage mediums, and flash memory arrays represent examples of machine or computer readable media storing instructions/code executable by microcontroller 110. Machine or computer readable media may generally refer to any medium or media used to provide instructions to microcontroller 110, including both volatile media, such as dynamic memory used for storage media or for buffers within microcontroller 110, and non-volatile media, such as electronic media, optical media, and magnetic media.


Taken together, microphone channel 105 and speaker channel 115 can form a portion of an audio bus incorporating a standard 3.5 mm audio plug (not shown). In some implementations, speaker channel 115 can include multiple audio channels, such as a left-speaker channel and a right-speaker channel of a 3.5 mm headphone port.


As described in further detail below, audio signaling received on speaker channel 115 can be simultaneously provided to microcontroller 110 and conditioning module 150. Audio signaling received by the microcontroller can be used to provide voltage information to the conditioning module so that the conditioning module parameters can be tuned to provide proper voltage offsets for the received audio signals. In this way, the microcontroller can “listen” to the received audio signal while adjusting conditioning parameters in order to standardize the received signals before digital conversion and further processing are performed.


Conditioning module 150 can be configured to clamp incoming audio signals to eliminate negative voltage components. In certain aspects, a digital output of microcontroller 110 is provided to a DAC (e.g., DAC 130), and an analog signal conveying information about the received signal is output to conditioning module 150.


An example of a conceptual block diagram of the coupling between an audio bus and a conditioning module is illustrated in circuit 200 of FIG. 2. As illustrated, circuit 200 comprises an audio bus 210 that includes microphone channel 215, ground 225, right audio channel 230, and left audio channel 235. Circuit 200 further depicts a conditioning module 250, which includes a diode clamp 255. As discussed above, conditioning module 250 is coupled to, and configured to receive audio signals from, audio bus 210. As discussed in further detail below, conditioning module 250 can also be configured to provide conditioned outputs to a microcontroller (not shown).


Although conditioning of an incoming audio signal can be performed using different circuit implementations than those illustrated in circuit 200, the illustrated implementation makes use of a diode clamp 255 for biasing received analog signals. As discussed above with respect to FIG. 1, biasing of the diode clamp can be based on inputs received from the microcontroller and provided to conditioning module 250 via a DAC, such as DAC 130.


Inputs received by conditioning module 250 from the microcontroller can help conditioning module 250 listen to the incoming audio signal and adjust the signal accordingly, e.g., by biasing or thresholding the signal before it is provided to the microcontroller. An example process for conditioning incoming audio signals, and for facilitating bi-directional communication of a reader, is discussed in further detail with respect to FIG. 3.


Specifically, FIG. 3 illustrates an example process 300 for implementing bi-directional reader communication, beginning with step 302 in which a training sequence is received at a reader via an audio bus. In certain aspects, the training sequence is used by the reader to verify a connection with a host device (e.g., mobile device) and to determine communication parameters associated with the mobile device.


Although the audio bus can include various types of busses and/or audio channels, in certain implementations the audio bus is a standard audio bus including a 3.5 mm audio plug with left and right speaker channels, as described above with respect to speaker channel 115 of FIG. 1. The training sequence can be received on the left and/or right speaker channel, and can be used by the reader to determine if signaling from a host mobile device is being properly received.


By way of example, a training sequence received by the reader (e.g., microcontroller 110) from a smart phone can be used to indicate a successful connection to the reader. Because the reader can be configured to know what the training sequence should look like, a comparison of the received training sequence with a training sequence profile can indicate to the reader whether adjustments can be made to the smart phone to improve signal quality. Although an analysis of the training sequence can indicate various parameters that require tuning, in some implementations, the amplitude of the training sequence is used to determine whether volume settings of the smart phone need to be adjusted.
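A hedged sketch of that amplitude check is shown below: the measured peak of the training sequence, normalized to the host's full-scale output, is mapped to a suggested volume adjustment. The specific thresholds are illustrative assumptions rather than values from the patent.

```python
def volume_advice(peak_amplitude, low=0.35, high=0.95):
    """Suggest a host-side volume adjustment from the normalized peak
    amplitude of the received training sequence (thresholds are illustrative)."""
    if peak_amplitude < low:
        return "increase host volume"
    if peak_amplitude > high:
        return "decrease host volume (signal may clip)"
    return "volume acceptable"

print(volume_advice(0.22))   # -> "increase host volume"
```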


In step 304, a reference signal is provided to the conditioning module. Although the reference signal can be used to tune different parameters of the conditioning module depending on implementation, in some aspects the reference signal is used to convey an instantaneous (or near instantaneous) voltage level of a currently received audio signal to the conditioning module.


The reference signal can be derived from and/or based on any AC signaling received by the reader, including the training sequence. Using the reference signal, the conditioning module can adjust a voltage bias that is applied to the received audio signaling, as indicated by step 306. For example, the reference signal can indicate when the voltage of a received audio signal is negative so that the conditioning module can bias the received signal (e.g., to a positive voltage value) before it is digitized and passed to the microcontroller. By actively “listening” to the received AC signaling, and providing feedback to the conditioning module, the reader can tune biasing and thresholding parameters, and thus can actively adapt to variations in received signaling. One advantage of the ability to engage in real-time tuning of the reader circuitry is the ability to adapt to variations in signals received across different host devices, or from similar devices with different signal properties, such as volume settings.
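The feedback described in step 306 can be pictured as a small control loop: the microcontroller measures where the biased signal currently sits relative to the middle of the ADC window and nudges the bias toward that target. The proportional update, the 3.3 V reference, and the chunk size below are assumptions for illustration; in the described hardware the correction would be written out through DAC 130 to the diode clamp.

```python
import numpy as np

def tune_bias(samples, v_ref=3.3, bias=0.0, gain=0.5):
    """Toy version of step 306: repeatedly pull the mid-point of the
    biased signal toward v_ref / 2 using a proportional correction."""
    for chunk in np.array_split(samples, 20):
        midpoint = (chunk + bias).mean()
        bias += gain * (v_ref / 2 - midpoint)   # nudge toward the window center
    return bias

# A negative-heavy input settles to a bias of about 2.45 V (1.65 V + 0.8 V).
audio = np.sin(np.linspace(0, 40 * np.pi, 2000)) - 0.8
print(round(tune_bias(audio), 2))
```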


Although tuning of the reader (e.g., using the conditioning module) can be performed throughout a communication session with a host device, in certain implementations, tuning is only performed upon an initial connection with a host. That is, the conditioning module adjusts its biasing and thresholding based on the reference signal derived from the training sequence, and then no additional adjustments are made for the remainder of the communication session. In certain aspects, tuning performed by the conditioning module is performed on a session-by-session basis, where circuit characteristics are not tuned or updated until the beginning of a new session (e.g., initial connection to a host) is detected. Thus after initial tuning has been performed, the reader is configured to receive signaling from the mobile device for the remainder of the communication session, i.e., until the reader is removed from the headphone port of the mobile device.
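The session behavior described above amounts to a very small state machine: calibrate once when a new insertion is detected, reuse the result for every subsequent signal, and forget it on removal. The sketch below is a hypothetical software rendering of that logic; the class and method names are invented for illustration.

```python
class ReaderSession:
    """Sketch of session-by-session tuning: calibration happens once per
    insertion, and the resulting bias is reused until the reader is removed."""

    def __init__(self):
        self.bias = None                         # None -> no active session

    def on_plug_in(self, training_samples):
        """New session detected: derive a bias from the training sequence."""
        self.bias = sum(training_samples) / len(training_samples)  # stand-in tuning
        return "send acknowledgement to host"

    def on_audio(self, samples):
        """Apply the fixed, per-session bias to incoming samples."""
        if self.bias is None:
            raise RuntimeError("no active session; waiting for training sequence")
        return [s - self.bias for s in samples]

    def on_unplug(self):
        """Reader removed from the headphone port: end the session."""
        self.bias = None
```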


In step 308, an acknowledgement is sent to the host (e.g., mobile device) to indicate that a communicative coupling with the mobile device has been successfully established. Reader-to-host communications can be provided on a microphone channel of a standard 3.5 mm audio port, such as microphone channel 105 discussed above in reference to FIG. 1. However, before transmission across the audio bus, reader-to-host communications must first be converted to an analog waveform, for example, using DAC 130.


The ability to provide signaling to the host via the microphone channel provides significant advantages over conventional unidirectional reader communication implementations. For example, information provided to the host can provide data necessary to manage communication with the reader in discrete sessions. As discussed above, session-based communication can be used to determine when tuning of the reader should be performed. Additionally, communications provided to the host can be used to provide indications of host parameters that can be adjusted, for example, to reduce a bit error rate (BER) for signaling received by the reader.


One potential difficulty in transmitting digital information between the reader and the host is that low-frequency audio signals may not be easily passed over the AC coupling in the communication path. To address this issue, digital information can be pre-processed so that the digital data is “whitened,” resulting in an audio signal that meets a minimum frequency threshold. By ensuring a minimum frequency of the resultant audio signal, the process can help to avoid the loss of digital data passed over the AC coupling. Although whitening of the digital signal, either on the reader or host side, can be performed in various manners, in certain aspects the digital signal can be encrypted to ensure a sufficiently Gaussian distribution of the data, as needed to meet the minimum frequency requirements of the AC coupling.
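One lightweight way to whiten the bit stream, as an alternative to full encryption, is a self-inverting XOR scrambler driven by a linear-feedback shift register, sketched below. The LFSR width, taps, and seed are illustrative choices, not parameters from the patent; the point is simply that long runs of identical bits (low-frequency content) are broken up before modulation.

```python
def scramble(data: bytes, seed: int = 0xACE1) -> bytes:
    """Whitening sketch: XOR the payload with a 16-bit LFSR keystream so
    long runs of identical bits are broken up before the data is modulated
    onto the microphone line.  Running it again with the same seed
    descrambles, since XOR scrambling is its own inverse."""
    state, out = seed, bytearray()
    for byte in data:
        key = 0
        for _ in range(8):
            lsb = state & 1
            state >>= 1
            if lsb:
                state ^= 0xB400            # taps of a maximal-length 16-bit LFSR
            key = (key << 1) | lsb
        out.append(byte ^ key)
    return bytes(out)

payload = b"\x00\x00\x00\x00\xff\xff"      # worst case: long constant runs
whitened = scramble(payload)
assert scramble(whitened) == payload       # round-trips back to the original
```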



FIG. 4 illustrates an example reader 400 according to some embodiments of the subject technology. As illustrated, reader 400 includes a housing 410 that is coupled to an audio plug 420 (e.g., a 3.5 mm audio plug).


Housing 410 contains the hardware components and circuitry of reader 400, as illustrated with respect to the example of FIG. 1. Additionally, housing 410 includes a slot (not shown) through which a payment card, such as a credit or debit card, may be swiped. Passage of a magnetic stripe of the payment card past a read head (e.g., read head 160 contained in housing 410) can enable payment information to be received via the read head. The resulting signal provided by the read head is typically an analog signal that must be digitized, e.g., using ADC 140, before the resulting digital information is provided to microcontroller 110.


Different types of information can be read from a magnetic stripe, depending on implementation. For example, user and payment card account information can be read from track 1 and track 2 of the magnetic stripe, respectively. However, in other implementations, any track (or combination of tracks) may be read from the magnetic stripe, including any combination, or all of tracks 1, 2 and 3.
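For context, the sketch below shows how an account number and expiry might be pulled out of decoded track 2 data once the read-head signal has been digitized. It assumes the standard ISO/IEC 7813 track 2 layout and uses a made-up test card number; it is illustrative only and is not a description of the reader's firmware.

```python
import re

# ISO/IEC 7813 Track 2: ;PAN=YYMM<service code><discretionary data>?
TRACK2 = re.compile(r";(?P<pan>\d{1,19})=(?P<yy>\d{2})(?P<mm>\d{2})(?P<rest>\d*)\?")

def parse_track2(track: str) -> dict:
    """Extract the primary account number and expiry from decoded Track 2 data."""
    m = TRACK2.fullmatch(track)
    if not m:
        raise ValueError("not a valid Track 2 string")
    return {"pan": m["pan"], "expiry": f"{m['mm']}/{m['yy']}"}

# A made-up test card number, not real account data.
print(parse_track2(";4111111111111111=2612101000000000000?"))
# -> {'pan': '4111111111111111', 'expiry': '12/26'}
```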


As illustrated, housing 410 is physically and communicatively coupled to audio plug 420, which can be removably inserted into a headphone port of a host device, such as a smart phone, personal computer, tablet device, or the like. As discussed above with respect to FIGS. 1 and 2, audio plug 420 forms part of an audio bus that includes left and right speaker channels, a microphone channel and a ground connection. Once audio plug 420 is inserted into the headphone port of a host device, such as a smart phone, bi-directional communication between reader 400 and the host is enabled, e.g., via the left/right speaker channels and microphone channel, using the methods and systems discussed above.


Although the reader illustrated in FIG. 4 can accept payment cards containing a magnetic stripe (e.g., using a read head), it is understood that the reader can be configured to receive other types of payment cards, and accordingly can contain additional or different hardware and/or software modules than those described above with respect to FIG. 1. For example, housing 410 can include a dip slot for accepting integrated circuit cards (“IC cards”), such as those conforming to the Europay, Mastercard, and Visa (EMV) standard.


Once successful bidirectional communication has been established between the reader and its host, the reader can be used to facilitate a payment transaction, for example between a merchant and a buyer using a magnetic payment card. FIG. 5 depicts a conceptual environment in which a reader of the subject technology can be used to facilitate a financial transaction between a buyer and a merchant. Although the diagram depicts components as functionally separate, such depiction is merely for illustrative purposes. It will be apparent that the components portrayed in this figure can be arbitrarily combined or divided into separate software, firmware and/or hardware components. Furthermore, it will also be apparent that such components, regardless of how they are combined or divided, can execute on the same host or on multiple hosts, where multiple hosts can be connected by one or more networks.


In the example of FIG. 5, the system includes a mobile device 500, a reader 501 connected to mobile device 500, a decoding engine 510, a user interaction engine 520, and a transaction engine 530, all running on mobile device 500. Additionally, the system may also include one or more of a user database 540, a product or service database 550, and a transaction database 560, all coupled to the transaction engine 530.


As used herein, the term engine refers to software, firmware, hardware, and/or other components used to effectuate a purpose. The engine will typically include software instructions that are stored in non-volatile memory (also referred to as secondary memory). When the software instructions are executed, at least a subset of the software instructions is loaded into memory (also referred to as primary memory) by a processor. The processor then executes the software instructions in memory. The processor may be a shared processor, a dedicated processor, or a combination of shared or dedicated processors. A typical program will include calls to hardware components (such as I/O devices), which typically requires the execution of drivers. The drivers may or may not be considered part of the engine, but the distinction is not critical.


As used herein, the term database is used broadly to include any known or convenient means for storing data, whether centralized or distributed, relational or otherwise.


In the example of FIG. 5, mobile device 500 to which reader 501 is connected can be, but is not limited to, a cell phone, such as Apple's iPhone; another portable electronic device, such as Apple's iPod Touch, Apple's iPad, or a mobile device based on Google's Android operating system; or any other portable electronic device that includes software, firmware, hardware, or any combination thereof capable of at least receiving the signal, decoding it if needed, exchanging information with a transaction server to verify the buyer's and/or seller's account information, conducting the transaction, and generating a receipt. Typical components of mobile device 500 can include, but are not limited to, persistent memories like flash ROM, random access memory like SRAM, a camera, a battery, an LCD driver, a display, a cellular antenna, a speaker, a Bluetooth circuit, and WiFi circuitry, where the persistent memory may contain programs, applications, and/or an operating system for the mobile device.


In some implementations, a system is provided with transaction engine 530 running on mobile device 500. In response to a financial transaction between a buyer and a seller, mobile device 500 accepts information including, but not limited to, information from the financial transaction and information pertaining to the financial transaction card used by the buyer in the transaction. Additionally, a financial transaction device can be utilized; non-limiting examples of financial transaction devices include a wristband, an RFID chip, a cell phone, a biometric marker, and the like. At least a portion of this information is communicated with a third-party financial institution or payment network to authorize the transaction.


Payment confirmation can be made with a communication channel of the buyer's choice. As non-limiting examples, confirmation of payment can be an electronic notification in a form selected from at least one of email, SMS message, tweet (a message delivered via Twitter), instant message, communication within a social network, and the like. In response to the transaction, a confirmation is made that the buyer is authorized to use the financial transaction card. In certain implementations, a confirmation can be provided that indicates a sufficiency of funds available to the buyer.


In the example of FIG. 5, reader 501 is configured to read data encoded in a magnetic stripe of a card being swiped by a buyer and send a signal that corresponds to the data read to mobile device 500. However, as discussed above, reader 501 may be configured to receive various payment card types, including but not limited to IC cards that can be provided to reader 501 using a dip slot.


The size of reader 501 can be miniaturized to be portable for connection with mobile device 500. For example, the size of card reader 501 can be miniaturized to an overall length of less than 1.5″. In addition, the miniaturized card reader 501 is designed to reliably read the card with minimum error via a single swipe by counteracting vendor-specific filtering done by mobile device 500. Note that this broad overview is meant to be non-limiting, as components of this process may be represented differently in different embodiments.


Many of the above-described features and applications are implemented as software processes that are specified as a set of instructions recorded on a computer readable storage medium (also referred to as computer readable medium). When these instructions are executed by one or more processing unit(s) (e.g., one or more processors, cores of processors, or other processing units), they cause the processing unit(s) to perform the actions indicated in the instructions. Examples of computer readable media include, but are not limited to, flash drives, RAM chips, hard drives, EPROMs, etc. The computer readable media does not include carrier waves and electronic signals passing wirelessly or over wired connections.


In this specification, the term “software” is meant to include firmware residing in read-only memory or applications stored in magnetic storage, which can be read into memory for processing by a processor. Also, in some implementations, multiple software aspects of the subject disclosure can be implemented as sub-parts of a larger program while remaining distinct software aspects of the subject disclosure. In some implementations, multiple software aspects can also be implemented as separate programs. Finally, any combination of separate programs that together implement a software aspect described here is within the scope of the subject disclosure. In some implementations, the software programs, when installed to operate on one or more electronic systems, define one or more specific machine implementations that execute and perform the operations of the software programs.


A computer program (also known as a program, software, software application, script, or code) can be written in any form of programming language, including compiled or interpreted languages, declarative or procedural languages, and it can be deployed in any form, including as a stand alone program or as a module, component, subroutine, object, or other unit suitable for use in a computing environment. A computer program may, but need not, correspond to a file in a file system. A program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub programs, or portions of code). A computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.


The functions described above can be implemented in digital electronic circuitry, in computer software, firmware or hardware. The techniques can be implemented using one or more computer program products. Programmable processors and computers can be included in or packaged as mobile devices. The processes and logic flows can be performed by one or more programmable processors and by one or more programmable logic circuits. General and special purpose computing devices and storage devices can be interconnected through communication networks.


Some implementations include electronic components, such as microprocessors, storage, and memory that store computer program instructions in a machine-readable or computer-readable medium (alternatively referred to as computer-readable storage media, machine-readable media, or machine-readable storage media). Some examples of such computer-readable media include RAM, ROM, flash memory (e.g., SD cards, mini-SD cards, micro-SD cards, etc.), magnetic and/or solid state hard drives, read-only and recordable Blu-Ray® discs, ultra-density optical discs, and any other optical or magnetic media. The computer-readable media can store a computer program that is executable by at least one processing unit, such as a microcontroller, and includes sets of instructions for performing various operations. Examples of computer programs or computer code include machine code, such as is produced by a compiler, and files including higher-level code that are executed by a computer, an electronic component, or a microprocessor using an interpreter.


While the above discussion primarily refers to microprocessor or multi-core processors that execute software, some implementations are performed by one or more integrated circuits, such as application specific integrated circuits (ASICs) or field programmable gate arrays (FPGAs). In some implementations, such integrated circuits execute instructions that are stored on the circuit itself.


As used in this specification and any claims of this application, the terms “computer”, “server”, “processor”, and “memory” all refer to electronic or other technological devices. These terms exclude people or groups of people. For the purposes of the specification, the terms “display” or “displaying” mean displaying on an electronic device. As used in this specification and any claims of this application, the terms “computer readable medium” and “computer readable media” are entirely restricted to tangible, physical objects that store information in a form that is readable by a computer. These terms exclude any wireless signals, wired download signals, and any other ephemeral signals.


Embodiments of the subject matter described in this specification can be implemented in a computing system that includes a back end component, e.g., as a data server, or that includes a middleware component, e.g., an application server, or that includes a front end component, e.g., a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation of the subject matter described in this specification, or any combination of one or more such back end, middleware, or front end components. The components of the system can be interconnected by any form or medium of digital data communication, e.g., a communication network. Examples of communication networks include a local area network (“LAN”) and a wide area network (“WAN”), an inter-network (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks).


The computing system can include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. In some embodiments, a server transmits data (e.g., an HTML page) to a client device (e.g., for purposes of displaying data to and receiving user input from a user interacting with the client device). Data generated at the client device (e.g., a result of the user interaction) can be received from the client device at the server.


It is understood that any specific order or hierarchy of steps in the processes disclosed is an illustration of exemplary approaches. Based upon design preferences, it is understood that the specific order or hierarchy of steps in the processes may be rearranged, or that all illustrated steps be performed. Some of the steps may be performed simultaneously. For example, in certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of various system components in the embodiments described above should not be understood as requiring such separation in all embodiments, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products.


The previous description is provided to enable any person skilled in the art to practice the various aspects described herein. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects. Thus, the claims are not intended to be limited to the aspects shown herein, but are to be accorded the full scope consistent with the language of the claims, wherein reference to an element in the singular is not intended to mean “one and only one” unless specifically so stated, but rather “one or more.” Unless specifically stated otherwise, the term “some” refers to one or more. Pronouns in the masculine (e.g., his) include the feminine and neuter gender (e.g., her and its) and vice versa. Headings and subheadings, if any, are used for convenience only and do not limit the subject disclosure.


A phrase such as an “aspect” does not imply that such aspect is essential to the subject technology or that such aspect applies to all configurations of the subject technology. A disclosure relating to an aspect may apply to all configurations, or one or more configurations. A phrase such as an aspect may refer to one or more aspects and vice versa. A phrase such as a “configuration” does not imply that such configuration is essential to the subject technology or that such configuration applies to all configurations of the subject technology. A disclosure relating to a configuration may apply to all configurations, or one or more configurations. A phrase such as a configuration may refer to one or more configurations and vice versa.


The word “exemplary” is used herein to mean “serving as an example or illustration.” Any aspect or design described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other aspects or designs.


All structural and functional equivalents to the elements of the various aspects described throughout this disclosure that are known or later come to be known to those of ordinary skill in the art are expressly incorporated herein by reference and are intended to be encompassed by the claims.

Claims
  • 1. A card reader comprising: a memory; a read module configured to read payment information from a financial payment card; an audio plug comprising an incoming audio channel, and a microphone channel, the audio plug configured to communicatively and mechanically couple the card reader to a mobile device; a processor coupled to the read module and the memory, wherein the processor is configured to send information to, and receive information from, the mobile device; a conditioning module coupled to the incoming audio channel, the conditioning module configured for standardizing voltage levels of analog signals received at the conditioning module from the mobile device based on a training sequence received from the mobile device, wherein the training sequence comprises a predetermined waveform; and an analog to digital converter (ADC) coupled to the processor, wherein the ADC is configured for receiving audio signals from the mobile device and converting the audio signals into digital signals for transmission to the processor, and a digital to analog converter (DAC), wherein the DAC is configured for receiving digital signals from the processor and converting the digital signals into audio signals for transmission to the mobile device.
  • 2. The card reader of claim 1, wherein the ADC is further configured to receive a clock signal.
  • 3. The card reader of claim 1, wherein the incoming audio channel comprises a right audio channel and a left audio channel, and wherein the card reader further comprises a converter coupled to the audio plug, wherein the converter is configured to demodulate the audio signals received from the mobile device based on differences in signaling between the right audio channel and the left audio channel.
  • 4. The card reader of claim 1, wherein the audio signals received from the mobile device comprise data modulated with music.
  • 5. A reader for receiving payment card information at a mobile point-of-sale terminal, the reader comprising: a memory; a conditioning module configured for standardizing voltage levels of incoming analog signals received at the conditioning module from a mobile device for use in receiving communications from the mobile device; an audio plug comprising an audio bus, wherein the audio plug is configured for insertion into a headphone port of the mobile device to provide communicative coupling between the reader and the mobile device; and a microprocessor coupled to the memory, the conditioning module and the audio bus, wherein the microprocessor is configured to perform operations comprising: receiving, from the mobile device via the audio bus, a training sequence for use in determining communication parameters associated with the mobile device, the training sequence comprising a predetermined waveform; and setting a voltage bias of the conditioning module based on the training sequence, wherein the voltage bias is used to standardize voltage levels for analog signaling received at the conditioning module from the mobile device.
  • 6. The reader of claim 5, wherein the audio bus comprises: a left audio channel and a right audio channel, and wherein the training sequence is received via one or more of the left audio channel and the right audio channel.
  • 7. The reader of claim 5, wherein the audio bus comprises a microphone channel, and wherein an acknowledgement signal is transmitted to the mobile device via the microphone channel.
  • 8. The reader of claim 5, wherein the microprocessor is further configured to perform operations comprising: providing a reference signal to the conditioning module via a digital to analog converter, wherein the reference signal provides a voltage indication for signaling received from the mobile device.
  • 9. The reader of claim 8, wherein the conditioning module is configured to clamp the signaling received from the mobile device to a predetermined voltage level.
  • 10. The reader of claim 5, wherein the analog signaling received at the conditioning module from the mobile device comprises whitened data.
  • 11. The reader of claim 5, wherein the microprocessor is coupled to the conditioning module via an analog to digital converter.
  • 12. The reader of claim 5, further comprising: a read module coupled to the microprocessor and configured to decode data on a financial payment card, and wherein the microprocessor is further configured to perform operations comprising: receiving payment information from the read module; transmitting the payment information to the mobile device, via the audio bus; and receiving a transmission confirmation from the mobile device indicating that the payment information was successfully received.
  • 13. A method for facilitating a financial transaction using a reader, the method comprising: receiving a training sequence from a mobile device via an audio bus, wherein the training sequence comprises a predetermined waveform; and selecting a voltage bias of a conditioning module of the reader based on the training sequence, wherein the voltage bias is used to standardize analog signaling received from the mobile device.
  • 14. The method of claim 13, wherein the audio bus comprises: a left audio channel and a right audio channel, and wherein the training sequence is received via one or more of the left audio channel and the right audio channel.
  • 15. The method of claim 13, further comprising: transmitting an acknowledgement signal to the mobile device via the audio bus, in response to the training sequence, and wherein the audio bus comprises a microphone channel, and wherein the acknowledgement signal is transmitted to the mobile device via the microphone channel.
  • 16. The method of claim 15, wherein the acknowledgement signal comprises whitened data.
  • 17. The method of claim 13, further comprising: providing a reference signal to the conditioning module via a digital to analog converter, wherein the reference signal provides a voltage indication for the analog signaling received from the mobile device.
  • 18. The method of claim 13, further comprising: receiving payment information from a read head configured to decode data on a magnetic stripe; and transmitting the payment information to the mobile device, via the audio bus.
  • 19. A non-transitory computer-readable storage medium comprising instructions stored therein, which when executed by one or more processors, cause the one or more processors to perform operations comprising: receiving a training sequence, via an analog to digital converter (ADC), wherein the training sequence is based on an analog training signal received by the ADC from a mobile device, the analog training signal comprising a predetermined waveform, and wherein the training sequence is used for configuring communication parameters of a card reader to facilitate receipt of signaling from the mobile device; and selecting a voltage bias of a conditioning module of the card reader based on the training sequence, wherein the voltage bias is used to standardize analog signaling received from the mobile device.
  • 20. The non-transitory computer-readable storage medium of claim 19, wherein the analog training signal is received via an audio bus comprising a 3.5 mm audio jack.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. application Ser. No. 14/231,598, filed Mar. 31, 2014, entitled, “BIDIRECTIONAL AUDIO COMMUNICATION IN READER DEVICES,” which claims the benefit of U.S. Provisional Patent Application No. 61/914,731, entitled “BIDIRECTIONAL AUDIO COMMUNICATION IN READER DEVICES”, filed on Dec. 11, 2013; both of which are hereby expressly incorporated herein by reference in their entireties.

US Referenced Citations (295)
Number Name Date Kind
3854036 Gupta et al. Dec 1974 A
4035614 Frattarola et al. Jul 1977 A
4254441 Fisher Mar 1981 A
4591937 Nakarai et al. May 1986 A
4609957 Gentet et al. Sep 1986 A
4727544 Brunner et al. Feb 1988 A
4788420 Chang et al. Nov 1988 A
4845740 Tokuyama et al. Jul 1989 A
5173597 Anglin Dec 1992 A
5266789 Anglin et al. Nov 1993 A
5406627 Thompson et al. Apr 1995 A
5434395 Storck et al. Jul 1995 A
5434400 Scherzer Jul 1995 A
5463678 Kepley, III et al. Oct 1995 A
5589855 Blumstein et al. Dec 1996 A
5603078 Henderson et al. Feb 1997 A
5616904 Fernadez Apr 1997 A
5679943 Schultz et al. Oct 1997 A
5729591 Bailey Mar 1998 A
5764742 Howard et al. Jun 1998 A
5850599 Seiderman Dec 1998 A
5878337 Joao et al. Mar 1999 A
5945654 Huang Aug 1999 A
5991749 Morrill, Jr. Nov 1999 A
D417442 Butts et al. Dec 1999 S
6006109 Shin Dec 1999 A
6021944 Arakaki Feb 2000 A
6032859 Muehlberger et al. Mar 2000 A
6061666 Do et al. May 2000 A
6129277 Grant et al. Oct 2000 A
6234389 Valliani et al. May 2001 B1
6308227 Kumar et al. Oct 2001 B1
6363139 Zurek et al. Mar 2002 B1
6400517 Murao Jun 2002 B1
6431445 Deland et al. Aug 2002 B1
6476743 Brown et al. Nov 2002 B1
6481623 Grant et al. Nov 2002 B1
6497368 Friend et al. Dec 2002 B1
6536670 Postman et al. Mar 2003 B1
6579728 Grant et al. Jun 2003 B2
D477321 Baughman Jul 2003 S
6612488 Suzuki Sep 2003 B2
6813608 Baranowski Nov 2004 B1
6832721 Fujii Dec 2004 B2
6850147 Prokoski et al. Feb 2005 B2
6868391 Hultgren Mar 2005 B1
6896182 Sakaguchi May 2005 B2
6944782 von Mueller et al. Sep 2005 B2
6979231 Shinohara Dec 2005 B2
7003316 Elias et al. Feb 2006 B1
7013149 Vetro et al. Mar 2006 B2
7149296 Brown et al. Dec 2006 B2
7163148 Durbin et al. Jan 2007 B2
7167711 Dennis Jan 2007 B1
7252232 Fernandes et al. Aug 2007 B2
7309012 von Mueller et al. Dec 2007 B2
7324836 Steenstra et al. Jan 2008 B2
7363054 Elias et al. Apr 2008 B2
D575056 Tan Aug 2008 S
7409234 Glezerman Aug 2008 B2
7424732 Matsumoto et al. Sep 2008 B2
7433452 Taylor et al. Oct 2008 B2
7505762 Onyon et al. Mar 2009 B2
7506812 von Mueller et al. Mar 2009 B2
D590828 Sherrod et al. Apr 2009 S
7520430 Stewart et al. Apr 2009 B1
7581678 Narendra et al. Sep 2009 B2
7600673 Stoutenburg et al. Oct 2009 B2
D607000 Cheng et al. Dec 2009 S
7703676 Hart et al. Apr 2010 B2
7708189 Cipriano May 2010 B1
7757953 Hart et al. Jul 2010 B2
7793834 Hachey et al. Sep 2010 B2
7869591 Nagel et al. Jan 2011 B1
7896248 Morley, Jr. Mar 2011 B2
7918394 Morley, Jr. Apr 2011 B1
7945494 Williams May 2011 B2
8011587 Johnson et al. Sep 2011 B2
8015070 Sinha et al. Sep 2011 B2
D646264 Dong Oct 2011 S
D653664 Turnbull et al. Feb 2012 S
8132670 Chen Mar 2012 B1
8231055 Wen Jul 2012 B2
8297507 Kayani Oct 2012 B2
8302860 McKelvey Nov 2012 B2
8336771 Tsai et al. Dec 2012 B2
D675618 Behar et al. Feb 2013 S
8376239 Humphrey Feb 2013 B1
D677667 Smith et al. Mar 2013 S
D679714 Smith et al. Apr 2013 S
D680537 Miller et al. Apr 2013 S
8413901 Wen Apr 2013 B2
8452004 Lee May 2013 B2
D686208 Miller et al. Jul 2013 S
8500010 Marcus et al. Aug 2013 B1
8500018 McKelvey et al. Aug 2013 B2
8560823 Aytek et al. Oct 2013 B1
8571989 Dorsey et al. Oct 2013 B2
8573486 McKelvey et al. Nov 2013 B2
8573487 McKelvey Nov 2013 B2
8573489 Dorsey et al. Nov 2013 B2
8584946 Morley, Jr. Nov 2013 B2
8584956 Wilson et al. Nov 2013 B2
8602305 Dorsey et al. Dec 2013 B2
8612352 Dorsey et al. Dec 2013 B2
8615445 Dorsey et al. Dec 2013 B2
8640953 Dorsey et al. Feb 2014 B2
D700606 Lo Mar 2014 S
8662389 Dorsey et al. Mar 2014 B2
8678277 Dorsey et al. Mar 2014 B2
D703211 Weller et al. Apr 2014 S
8701996 Dorsey et al. Apr 2014 B2
8701997 Dorsey et al. Apr 2014 B2
D706266 Rotsaert Jun 2014 S
8740072 Dorogusker Jun 2014 B1
8763900 Marcus et al. Jul 2014 B2
D711876 McWilliam et al. Aug 2014 S
8794517 Templeton et al. Aug 2014 B1
D712892 Hong et al. Sep 2014 S
8820650 Wilson et al. Sep 2014 B2
8840017 Chan et al. Sep 2014 B2
8840024 McKelvey et al. Sep 2014 B2
8870070 McKelvey et al. Oct 2014 B2
8870071 McKelvey Oct 2014 B2
8876003 McKelvey Nov 2014 B2
8910868 Wade et al. Dec 2014 B1
8931699 Wade et al. Jan 2015 B1
D724094 Blochinger et al. Mar 2015 S
D725655 Debaigue et al. Mar 2015 S
8967465 Wade et al. Mar 2015 B1
D726171 Edwards Apr 2015 S
9016570 Gluck Apr 2015 B1
9016572 Babu et al. Apr 2015 B2
D728549 Su et al. May 2015 S
D728568 Debaigue et al. May 2015 S
D731493 Mills Jun 2015 S
9063737 Babu et al. Jun 2015 B2
D740820 Templeton et al. Oct 2015 S
20010001856 Gould et al. May 2001 A1
20020002507 Hatakeyama Jan 2002 A1
20020030871 Anderson et al. Mar 2002 A1
20020073304 Marsh et al. Jun 2002 A1
20020077974 Ortiz Jun 2002 A1
20020099648 DeVoe et al. Jul 2002 A1
20020108062 Nakajima et al. Aug 2002 A1
20020165462 Westbrook et al. Nov 2002 A1
20020169541 Bouve et al. Nov 2002 A1
20020188535 Chao et al. Dec 2002 A1
20030089772 Chien May 2003 A1
20030132300 Dilday et al. Jul 2003 A1
20030135463 Brown et al. Jul 2003 A1
20030144040 Liu et al. Jul 2003 A1
20040011650 Zenhausern et al. Jan 2004 A1
20040012875 Wood Jan 2004 A1
20040033726 Kao Feb 2004 A1
20040041911 Odagiri et al. Mar 2004 A1
20040058705 Morgan et al. Mar 2004 A1
20040087339 Goldthwaite et al. May 2004 A1
20040093496 Colnot May 2004 A1
20040104268 Bailey Jun 2004 A1
20040127256 Goldthwaite et al. Jul 2004 A1
20040128256 Krouse et al. Jul 2004 A1
20040151026 Naso et al. Aug 2004 A1
20040204074 Desai Oct 2004 A1
20040204082 Abeyta Oct 2004 A1
20040230489 Goldthwaite et al. Nov 2004 A1
20040230526 Praisner Nov 2004 A1
20050009004 Xu et al. Jan 2005 A1
20050010702 Saito et al. Jan 2005 A1
20050077870 Ha et al. Apr 2005 A1
20050156037 Wurzburg Jul 2005 A1
20050156038 Wurzburg et al. Jul 2005 A1
20050194452 Nordentoft et al. Sep 2005 A1
20050209719 Beckert et al. Sep 2005 A1
20050236480 Vrotsos et al. Oct 2005 A1
20050242173 Suzuki Nov 2005 A1
20050247787 Von Mueller et al. Nov 2005 A1
20060000917 Kim et al. Jan 2006 A1
20060094481 Gullickson May 2006 A1
20060122902 Petrov et al. Jun 2006 A1
20060152276 Barksdale Jul 2006 A1
20060208066 Finn et al. Sep 2006 A1
20060219776 Finn Oct 2006 A1
20060223580 Antonio et al. Oct 2006 A1
20060234771 Shavrov Oct 2006 A1
20060273158 Suzuki Dec 2006 A1
20070063048 Havens et al. Mar 2007 A1
20070067833 Colnot Mar 2007 A1
20070100651 Ramer et al. May 2007 A1
20070124211 Smith May 2007 A1
20070155430 Cheon et al. Jul 2007 A1
20070174080 Outwater Jul 2007 A1
20070201492 Kobayashi Aug 2007 A1
20070221728 Ferro et al. Sep 2007 A1
20070244811 Tumminaro Oct 2007 A1
20070250623 Hickey et al. Oct 2007 A1
20070255620 Tumminaro et al. Nov 2007 A1
20070255643 Capuano et al. Nov 2007 A1
20070255653 Tumminaro et al. Nov 2007 A1
20080027815 Johnson et al. Jan 2008 A1
20080040265 Rackley III et al. Feb 2008 A1
20080040274 Uzo Feb 2008 A1
20080059370 Sada et al. Mar 2008 A1
20080059375 Abifaker Mar 2008 A1
20080103972 Lane May 2008 A1
20080147564 Singhal Jun 2008 A1
20080172306 Schorr et al. Jul 2008 A1
20080177662 Smith et al. Jul 2008 A1
20080208762 Arthur et al. Aug 2008 A1
20080238610 Rosenberg Oct 2008 A1
20080249939 Veenstra Oct 2008 A1
20080275779 Lakshminarayanan Nov 2008 A1
20090048978 Ginter et al. Feb 2009 A1
20090068982 Chen et al. Mar 2009 A1
20090098908 Silverbrook et al. Apr 2009 A1
20090100168 Harris Apr 2009 A1
20090104920 Moon et al. Apr 2009 A1
20090117883 Coffing et al. May 2009 A1
20090119190 Realini May 2009 A1
20090125429 Takayama May 2009 A1
20090144161 Fisher Jun 2009 A1
20090159681 Mullen et al. Jun 2009 A1
20090166422 Biskupski Jul 2009 A1
20090187492 Hammad et al. Jul 2009 A1
20100063893 Townsend Mar 2010 A1
20100108762 Morley, Jr. May 2010 A1
20100127857 Kilmurray et al. May 2010 A1
20100184479 Griffin, Jr. Jul 2010 A1
20100222000 Sauer et al. Sep 2010 A1
20100241838 Cohen et al. Sep 2010 A1
20100243732 Wallner Sep 2010 A1
20100289390 Kenney Nov 2010 A1
20110033910 Yamanaka et al. Feb 2011 A1
20110053560 Jain et al. Mar 2011 A1
20110084131 McKelvey Apr 2011 A1
20110084139 McKelvey et al. Apr 2011 A1
20110137803 Willins Jun 2011 A1
20110161235 Beenau et al. Jun 2011 A1
20110165896 Stromberg et al. Jul 2011 A1
20110174879 Morley, Jr. Jul 2011 A1
20110191196 Orr et al. Aug 2011 A1
20110198395 Chen Aug 2011 A1
20110202463 Powell Aug 2011 A1
20110258120 Weiss Oct 2011 A1
20110313880 Paul et al. Dec 2011 A1
20120008851 Pennock et al. Jan 2012 A1
20120011071 Pennock et al. Jan 2012 A1
20120012653 Johnson et al. Jan 2012 A1
20120016794 Orr et al. Jan 2012 A1
20120026018 Lin Feb 2012 A1
20120052910 Mu et al. Mar 2012 A1
20120095870 McKelvey Apr 2012 A1
20120097739 Babu et al. Apr 2012 A1
20120097740 Lamba et al. Apr 2012 A1
20120118956 Lamba et al. May 2012 A1
20120118959 Sather et al. May 2012 A1
20120118960 Sather et al. May 2012 A1
20120126005 Dorsey et al. May 2012 A1
20120126006 Dorsey et al. May 2012 A1
20120126007 Lamba et al. May 2012 A1
20120126010 Babu et al. May 2012 A1
20120126011 Lamba et al. May 2012 A1
20120126012 Lamba et al. May 2012 A1
20120126013 Sather et al. May 2012 A1
20120126014 Sather et al. May 2012 A1
20120130903 Dorsey et al. May 2012 A1
20120132712 Babu et al. May 2012 A1
20120138683 Sather et al. Jun 2012 A1
20120154561 Chari Jun 2012 A1
20120168505 Sather et al. Jul 2012 A1
20120234918 Lindsay Sep 2012 A1
20120246074 Annamalai et al. Sep 2012 A1
20120259651 Mallon et al. Oct 2012 A1
20120270528 Goodman Oct 2012 A1
20130031004 Dorsey et al. Jan 2013 A1
20130087614 Limtao et al. Apr 2013 A1
20130137367 Fisher May 2013 A1
20130200153 Dorsey et al. Aug 2013 A1
20130207481 Gobburu et al. Aug 2013 A1
20130254117 von Mueller et al. Sep 2013 A1
20130304244 Ojanpera Nov 2013 A1
20140001257 Dorsey et al. Jan 2014 A1
20140001263 Babu et al. Jan 2014 A1
20140017955 Lo et al. Jan 2014 A1
20140018016 Chang et al. Jan 2014 A1
20140061301 Cho et al. Mar 2014 A1
20140076964 Morley, Jr. Mar 2014 A1
20140089205 Kapur et al. Mar 2014 A1
20140097242 McKelvey Apr 2014 A1
20140124576 Zhou et al. May 2014 A1
20140131442 Morrow et al. May 2014 A1
20140144983 Dorsey et al. May 2014 A1
20140203082 Huh Jul 2014 A1
20150149992 Wade et al. May 2015 A1
20150199677 Wade et al. Jul 2015 A1
Foreign Referenced Citations (21)
Number Date Country
2 812 251 Apr 2012 CA
1 145 766 Oct 2001 EP
2003-108777 Apr 2003 JP
2004-078662 Mar 2004 JP
2005-063869 Mar 2005 JP
2005-242550 Sep 2005 JP
2005-269172 Sep 2005 JP
2009-199649 Sep 2009 JP
2001-313714 Nov 2011 JP
2013-518344 May 2013 JP
10-0452161 Oct 2004 KR
10-2005-0077659 Aug 2005 KR
10-2008-0039330 May 2008 KR
0165827 Sep 2001 WO
02084548 Oct 2002 WO
2007070592 Jun 2007 WO
2009128483 Oct 2009 WO
2010097711 Sep 2010 WO
2010111130 Sep 2010 WO
2010135174 Nov 2010 WO
2013009891 Jan 2013 WO
Non-Patent Literature Citations (151)
Entry
Notice of Allowance mailed Aug. 28, 2015, for U.S. Appl. No. 13/298,510, of Lamba, K. et al., filed Nov. 17, 2011.
Non-Final Office Action mailed Sep. 11, 2014, for U.S. Appl. No. 13/298,510, of Lamba, K. et al., filed Nov. 17, 2011.
Final Office Action mailed May 6, 2015, for U.S. Appl. No. 13/298,510, of Lamba, K. et al., filed Nov. 17, 2011.
Final Office Action mailed Jul. 9, 2012, for U.S. Appl. No. 13/005,822, of McKelvey, J., et al., filed Jan. 13, 2011.
Non-Final Office Action mailed Jun. 18, 2013, for U.S. Appl. No. 13/005,822, of McKelvey, J., et al., filed Jan. 13, 2011.
Non-Final Office Action mailed Dec. 10, 2013, for U.S. Appl. No. 13/005,822 of McKelvey, J., et al., filed Jan. 13, 2011.
Non-Final Office Action mailed Oct. 7, 2011, for U.S. Appl. No. 13/043,258, of McKelvey, J., filed Mar. 8, 2011.
Final Office Action mailed Jul. 13, 2012, for U.S. Appl. No. 13/043,258, of McKelvey, J., filed Mar. 8, 2011.
Non-Final Office Action mailed Dec. 11, 2013, for U.S. Appl. No. 13/043,258, of McKelvey, J., filed Mar. 8, 2011.
Notice of Allowance mailed Jul. 1, 2014, for U.S. Appl. No. 13/043,258, of McKelvey, J., filed Mar. 8, 2011.
Advisory Action mailed Aug. 15, 2012, for U.S. Appl. No. 13/043,258, of McKelvey, J., filed Mar. 8, 2011.
Advisory Action mailed Aug. 17, 2012, for U.S. Appl. No. 13/005,822, of McKelvey, J., et al., filed Jan. 13, 2011.
Notice of Allowance mailed Jun. 24, 2014, for U.S. Appl. No. 13/005,822, of McKelvey, J., et al., filed Jan. 13, 2011.
Non-Final Office Action mailed Oct. 7, 2014, for U.S. Appl. No. 13/298,534, of Lamba, K., et al., filed Nov. 17, 2011.
Final Office Action mailed Apr. 8, 2015, for U.S. Appl. No. 13/298,534, of Lamba, K., et al., filed Nov. 17, 2011.
Final Office Action mailed Sep. 6, 2013, for U.S. Appl. No. 13/298,560, of Lamba, K., et al., filed Nov. 17, 2011.
Advisory Action mailed Oct. 21, 2013, for U.S. Appl. No. 13/298,560, of Lamba, K., et al., filed Nov. 17, 2011.
Final Office Action mailed Aug. 15, 2013, for U.S. Appl. No. 13/043,263, of McKelvey, J., filed Mar. 8, 2011.
Advisory Action mailed Nov. 8, 2013, for U.S. Appl. No. 13/043,263, of McKelvey, J., filed Mar. 8, 2011.
Non-Final Office Action mailed Feb. 24, 2014, for U.S. Appl. No. 13/043,263, of McKelvey, J., filed Mar. 8, 2011.
Notice of Allowance mailed Jul. 15, 2014, for U.S. Appl. No. 13/043,263, of McKelvey, J., filed Mar. 8, 2011.
Non-Final Office Action mailed Jul. 22, 2014, for U.S. Appl. No. 13/298,560, of Lamba, K., et al., filed Nov. 17, 2011.
Final Office Action mailed Jan. 28, 2015, for U.S. Appl. No. 13/298,560, of Lamba, K., et al., filed Nov. 17, 2011.
Non-Final Office Action mailed Oct. 11, 2011, for U.S. Appl. No. 13/043,203, of McKelvey, J., et al., filed Mar. 8, 2011.
Final Office Action mailed Jul. 6, 2012, for U.S. Appl. No. 13/043,203, of McKelvey, J., et al., filed Mar. 8, 2011.
Non-Final Office Action mailed Oct. 11, 2011, for U.S. Appl. No. 13/043,263, of McKelvey, J., filed Mar. 8, 2011.
Advisory Action mailed Aug. 1, 2012, for U.S. Appl. No. 13/043,203, of McKelvey, J., filed Mar. 8, 2011.
Advisory Action mailed Aug. 16, 2012, for U.S. Appl. No. 13/043,263, of McKelvey, J., filed Mar. 8, 2011.
Non-Final Office Action mailed Apr. 29, 2013, for U.S. Appl. No. 13/043,263, of McKelvey, J., filed Mar. 8, 2011.
Non-Final Office Action mailed Apr. 30, 2013, for U.S. Appl. No. 13/043,203, of McKelvey, J., filed Mar. 8, 2011.
Non-Final Office Action mailed May 28, 2013, for U.S. Appl. No. 13/298,560, of Lamba, K., et al., filed Nov. 17, 2011.
Notice of Allowance mailed Jul. 9, 2013, for U.S. Appl. No. 13/043,203, of McKelvey, J., filed Mar. 8, 2011.
Non-Final Office Action mailed Jul. 6, 2015, for U.S. Appl. No. 13/298,534, of Lamba, K., et al., filed Nov. 17, 2011.
Advisory Action mailed Apr. 9, 2015, for U.S. Appl. No. 13/298,560, of Lamba, K., et al., filed Nov. 17, 2011.
Final Office Action mailed Jul. 9, 2012, for U.S. Appl. No. 13/043,263, of McKelvey, J., filed Mar. 8, 2011.
Non-Final Office Action mailed Jul. 16, 2015, for U.S. Appl. No. 13/298,560, of Lamba, K., et al., filed Nov. 17, 2011.
Application for Registration of an Industrial Design Examiner's Report for Canadian Design Application No. 159528, mailed Jun. 11, 2015.
English-language translation of Notice of Reasons for Rejection for Japanese Application No. 2014-0255525, mailed Mar. 31, 2015.
Certificate of Registration of Design for Indian Design Application No. 267386, mailed Nov. 14, 2014 (Registration No. 39149).
Non-Final Office Action mailed Jul. 27, 2015, for U.S. Appl. No. 29/493,212, of Edwards, T., et al., filed Jun. 6, 2014.
English-language translation of Decision of Final Rejection for Japanese Patent Application No. 2013-533897, mailed Feb. 23, 2015.
English-language translation of Office Action for Japanese Patent Application No. 2013-533897, mailed Jun. 5, 2014.
English-language translation of Search Report for Japanese Patent Application No. 2013-533897, mailed Apr. 14, 2014.
Certificate of Design Registration for European Patent Application No. 002578674, mailed Nov. 14, 2014 (Registration No. 002578674-0001).
Advisory Action mailed Aug. 24, 2012, for U.S. Appl. No. 13/010,976, of Babu, A. R., et al., filed Jan. 21, 2011.
Non-Final Office Action mailed Aug. 15, 2014, for U.S. Appl. No. 13/010,976, of Babu, A. R., et al., filed Jan. 21, 2011.
Notice of Allowance mailed Dec. 24, 2014, for U.S. Appl. No. 13/010,976, of Babu, A. R., et al., filed Jan. 21, 2011.
Non-Final Office Action mailed Apr. 2, 2014, for U.S. Appl. No. 14/012,655, of McKelvey, J., filed Aug. 28, 2013.
Final Office Action mailed Aug. 15, 2014, for U.S. Appl. No. 14/012,655, of McKelvey, J., filed Aug. 28, 2013.
Non-Final Office Action mailed Jan. 20, 2015, for U.S. Appl. No. 14/012,655, of McKelvey, J., filed Aug. 28, 2013.
Notice of Allowance mailed Sep. 1, 2015, for U.S. Appl. No. 13/298,487, of Babu, A., et al., filed Nov. 17, 2011.
Non-Final Office Action mailed Apr. 29, 2013, for U.S. Appl. No. 13/298,487, of Babu, A., et al., filed Nov. 17, 2011.
Final Office Action mailed Aug. 22, 2013, for U.S. Appl. No. 13/298,487, of Babu, A., et al., filed Nov. 17, 2011.
Advisory Action mailed Oct. 22, 2013, for U.S. Appl. No. 13/298,487, of Babu, A., et al., filed Nov. 17, 2011.
Non-Final Office Action mailed Aug. 21, 2014, for U.S. Appl. No. 13/298,487, of Babu, A., et al., filed Nov. 17, 2011.
Final Office Action mailed Mar. 18, 2015, for U.S. Appl. No. 13/298,487, of Babu, A., et al., filed Nov. 17, 2011.
Non-Final Office Action mailed Feb. 20, 2015, for U.S. Appl. No. 14/512,104, of Templeton, T., et al., filed Oct. 10, 2014.
International Search Report and Written Opinion for PCT Application No. PCT/US2010/052483, mailed Jun. 10, 2011.
International Search Report and Written Opinion for PCT Application No. PCT/US2011/055386, mailed Feb. 22, 2012.
European Search Report and Opinion for European Application No. 11833172.7, mailed Apr. 22, 2014.
International Search Report and Written Opinion for PCT Application No. PCT/US2014/069788, mailed May 14, 2015.
International Search Report and Written Opinion for PCT Application No. PCT/US2012/064782, mailed Feb. 26, 2013.
International Search Report and Written Opinion for PCT Application No. PCT/US2010/052481, mailed Jun. 23, 2011.
International Search Report and Written Opinion for PCT Application No. PCT/US2014/067074, mailed Mar. 15, 2015.
Examination Report for Canadian Application No. 2,812,594, mailed on Feb. 24, 2015.
Examination Report No. 1 for Australian Application No. 201415781, mailed on Feb. 23, 2015 (Registration No. 359005).
“Review: Square, Inc. Square Credit Card Reader (2013),” iLounge, Retrieved from the Internet URL: http://www.ilounge.com/index.php/reviews/entry/square-inc.-square-credit-card-reader-2013/, on Jan. 16, 2014, pp. 3.
“TUAW The Unofficial Apple Weblog, Square credit card reader loses weight, gains accuracy”, Retrieved from the Internet URL: http://www.tuaw.com/2013/12/09/square-credit-card—reader-loses-weight-gains-accuracy/, on Dec. 9, 2013, p. 1.
Examination Report No. 2 for Australian Application No. 201415781, mailed Aug. 13, 2015 (Registration No. 359005).
Ryan, P., “Plug and Pay: A Gallery of 26 Mobile Card Readers,” Aug. 20, 2013, Retrieved from the Internet URL: http://bankinnovation.net/2013/08/plug-and-pay-a-gallery-of-26-mobile-card-readers/, on Feb. 19, 2015, pp. 1-12.
Notification of Registration of a Design for Australian Application No. 201415781, mailed on Nov. 27, 2014 (Registration No. 359005).
First Examination Report for Indian Design Application No. 267386, mailed Feb. 5, 2015.
Non-Final Office Action mailed Apr. 25, 2013, for U.S. Appl. No. 13/298,491, of Lamba, K., et al., filed Nov. 17, 2011.
Final Office Action mailed Sep. 17, 2013, for U.S. Appl. No. 13/298,491, of Lamba, K., et al., filed Nov. 17, 2011.
Advisory Action mailed Oct. 22, 2013 for U.S. Appl. No. 13/298,491, of Lamba, K., et al., filed Nov. 17, 2011.
Non-Final Office Action mailed Jul. 17, 2014, for U.S. Appl. No. 13/298,491, of Lamba, K., et al., filed Nov. 17, 2011.
Final Office Action mailed Feb. 4, 2015, for U.S. Appl. No. 13/298,491, of Lamba, K., et al., filed Nov. 17, 2011.
Notice of Allowance mailed Jun. 22, 2015, for U.S. Appl. No. 13/298,491, of Lamba, K., et al., filed Nov. 17, 2011.
Non-Final Office Action mailed Jun. 6, 2014, for U.S. Appl. No. 14/231,598, of Wade, J., et al., filed Mar. 31, 2014.
Non-Final Office Action mailed Apr. 10, 2015, for U.S. Appl. No. 14/189,997, of Lamfalusi, M., et al., filed Feb. 25, 2014.
Notice of Allowance mailed Nov. 25, 2014, for U.S. Appl. No. 14/231,598, of Claude, J.B., et al., filed Mar. 31, 2014.
Non-Final Office Action mailed Jun. 22, 2015, for U.S. Appl. No. 14/322,815, of Edwards, T., filed Jul. 2, 2014.
Notice of Allowance mailed Aug. 1, 2014, for U.S. Appl. No. 14/203,463, of Wade, J., et al., filed Mar. 10, 2014.
Notice of Allowance mailed Aug. 27, 2015, for U.S. Appl. No. 13/298,501, of Babu, A., et al., filed Nov. 17, 2011.
Non-Final Office Action mailed Sep. 11, 2014, for U.S. Appl. No. 13/298,501, of Babu, A., et al., filed Nov. 17, 2011.
Final Office Action mailed May 6, 2015, for U.S. Appl. No. 13/298,501, of Babu, A., et al., filed Nov. 17, 2011.
Notice of Allowance mailed Oct. 17, 2014, for U.S. Appl. No. 14/220,967, of Wade, J., et al., filed Mar. 20, 2014.
Notice of Allowance mailed Dec. 18, 2014, for U.S. Appl. No. 14/220,967, of Wade, J., et al., filed Mar. 20, 2014.
Notice of Allowance mailed May 19, 2015, for U.S. Appl. No. 14/620,765, of Wade, J., et al., filed Feb. 12, 2015.
Non-Final Office Action mailed May 26, 2015, for U.S. Appl. No. 14/551,681, of Wade, J., et al., filed Nov. 24, 2014.
Notice of Allowance mailed Jun. 10, 2014, for U.S. Appl. No. 29/491,147, of Templeton T., et al., filed May 16, 2014.
Non-Final Office Action mailed Sep. 11, 2014, for U.S. Appl. No. 13/298,506, of Lamba, K., et al., filed Nov. 17, 2011.
Final Office Action mailed May 6, 2015, for U.S. Appl. No. 13/298,506, of Lamba, K., et al., filed Nov. 17, 2011.
Non-Final Office Action mailed Sep. 30, 2011, for U.S. Appl. No. 13/005,822, of McKelvey, J., et al., filed Jan. 13, 2011.
Non-Final Office Action mailed Nov. 21, 2013, for U.S. Appl. No. 14/052,009, of Wilson, M., et al., filed Oct. 11, 2013.
Non-Final Office Action mailed Jul. 19, 2012, for U.S. Appl. No. 12/903,758, of Wilson, M., et al., filed Oct. 13, 2010.
“Reading magnetic cards (almost) for free,” Lekernel's Scrapbook, (“Lekernel”), Jan. 26, 2009, Retrieved from the Internet URL: http://lekernel.net/blog/?p=12, on May 5, 2011, pp. 1-2.
“MSR500EX (Mini123EX) Portable Magnetic Stripe Card Reader,” Tyner, Apr. 27, 2007, Retrieved from the Internet URL: http://www.tyner.com/magnetic/msr500ex.htm, on Apr. 22, 2011, pp. 1-3.
Padilla, L., “Turning your mobile into a magnetic stripe reader,” Retrieved from the Internet URL: http://www.gae.ucm.es/˜padilla/extrawork/mobilesoundtrack.html, on Feb. 7, 2011, pp. 1-4.
Padilla, L., “Magnetic stripe reader circuit,” Jan. 28, 1997, Retrieved from the Internet URL: http://www.gae.ucm.es/˜padilla/extraworkImagamp.html, on May 5, 2011, pp. 1-7.
Padilla, L. “The simplest magnetic stripe reader,” Jan. 27, 2003, Retrieved from the Internet URL: www.gae.ucm.esi˜padilla/extrawork/soundtrack.html, on Dec. 21, 2009, pp. 1-5.
“Travel industry targeted for Palm PDA card reader,” Retrieved from the Internet URL: http://www.m-travel.com/news/2001/08/travel—industry.html, on Apr. 19, 2011, pp. 1-2.
“Semtek to target healthcare with HandEra PDAs and PDA swipe card reader,” Aug. 29, 2001, Retrieved from the Internet URL: http://www.pdacortex.com/semtek.htm, on Apr. 19, 2011, pp. 1-2.
“Semtek 3913 Insert Magnetic Card Reader 20 Pin Serial RS232,” Product description, RecycledGoods.com, Retrieved from the Internet URL: http://www.recycledgoods.com/products/Semtek-3913-Insert-Magnetic-Card-Reader-20-Pi . . ., on Apr. 19, 2011, pp. 1-3.
Credit Card Swiper and Reader for iPhone, iPad, Blackberry, Android and more, Retrieved from the Internet URL: http://hubpages.com/hub/Credit-Card-Swiper-and-Reader-for-iPhone-iPad-Blackberry-An . . ., on Apr. 20, 2011, pp. 1-2.
Titlow, J.P., “ROAMpay is like Square for Blackberry (Plus Android, iOS and Desktops),” Dec. 1, 2010, Retrieved from the Internet URL: http://www.readwriteweb.com/biz/2010/12/roampay-is-like-square-for-bla.php, on Apr. 20, 2011, pp. 1-12.
Veneziani, V., “Use a cellphone as a magnetic card reader,” Apr. 15, 2005, Retrieved from the Internet URL: http://hackaday.com/2005/04/15/use a-cellphone-as-a-magnetic-card . . ., on Feb. 7, 2011, pp. 1-10.
Buttell, A.E., “Merchants eye mobile phones to transact card payments,” Feb. 3, 2010, Retrieved from the Internet URL: http://www.merchantaccountguide.com/merchant-account-news/cell-phone-credit-card-mer . . ., on Feb. 8, 2011, pp. 1-3.
“USB Magnetic Stripe Credit/Card Track-2 Reader and Writer (75/210BPI),” Deal Extreme (dealextreme.com), Nov. 15, 2008, Retrieved from the Internet URL: http://www.dealextreme.com/p/usb-magnetic-stripe-credit-debit-card-track-2-reader-and-wr . . ., on Feb. 8, 2011, pp. 1-3.
“Mophie Marketplace Magnetic Strip Reader/Case for iPhone 3G & 3GS- Grey,” J&R (JR.com), Retrieved from the Internet URL: http://www.jr.com/mophie/pe/MPE—MPIP3GBLK/, on Feb. 8, 2011, pp. 1-1.
“Barcode scanner and Magnetic Stripe Reader (MSR) for Pocke..,” Tom's Hardware (tomshardware.com), Retrieved from the Internet URL: http://www.tomshardware.com/forum/24068-36-barcode-scanner-magnetic-stripe-reader-po . . ., on Feb. 8, 2011, pp. 1-2.
“A Magnetic Stripe Reader—Read Credit Cards & Driver Licences!,” Articlesbase (articlesbase.com), Sep. 7, 2009, Retrieved from the Internet URL: http://www.articlesbase.com/electronics-articles/a-magnetic-stripe-reader-read-credit-cards-. . ., on Feb. 8, 2011, pp. 1-3.
Jones, R., “U.S. Credit Cards to get a high-tech makeover,” Oct. 22, 2010, Retrieved from the Internet URL: http://lifeine.today.com/—news/2010/10/22/5334208-us-credit-cards-to-get-a-high-tech-mak . . ., on Feb. 8, 2011, pp. 1-8.
“Arduino magnetic stripe decoder,” Instructables, Retrieved from the Internet URL: http://www.instructables.com/id/Arduino-magneticstripe-decorder/, on Feb. 8, 2011, pp. 1-5.
“Magnetic Stripe Reader (MSR) MSR7000-100R,” Motorola Solutions, Retrieved from the Internet URL: http://www.motorola.com/business/US-EN/MSR7000-100R—US-EN.do?vgnextoid=164fc3 . . ., on Feb. 8, 2011, pp. 1-1.
“Pay@PC,” Retrieved from the Internet URL: http://www.merchantanywhere.com/PAY—AT—PCT@PC.htm, on Feb. 11, 2011, pp. 1-2.
“Get paid on the spot from your mobile phone,” Retrieved from the Internet URL: http://payments.intuit.com/products/basic-payment-solutions/mobile-credit-card-processin. . ., on Feb. 11, 2011, pp. 1-3.
“Touch-Pay Wireless Credit Card Processing,” MerchantSeek, Retrieved from the Internet URL: http://www.merchantseek.com/wireless-credit-card-processing.htm, on Feb. 11, 2011, pp. 1-5.
“Announcement: Semtek Introduces Side Swipe II Card Reader for Wireless Devices,” Brighthand, Retrieved from the Internet URL: http://forum.brighthand.com/pdas-handhelds/173285-announcement-semtek-introduces-sid . . . , on Apr. 19, 2011, pp. 1-2.
Grandison, K., “vTerminal Credit Card Processing App for AuthorizeNet and PayPal Payflow Pro for Curve 8350 8500 8900 and Bold 9000,” Retrieved from the Internet URL: http://www.4blackberry.net/tag/business-tools/vterminal-credit-card-processing-app-for-authorizenet-and-paypal-payflow-pro-for-curve-8350-8500-890-download-2075.html, on Mar. 30, 2015, pp. 1-4.
Harris, A., “Magnetic Stripe Card Spoofer,” Aug. 4, 2008, Retrieved from the Internet URL: http://hackaday.com/2008/08/04/magnetic-stripe-card-spoofer/, on Apr. 25, 2011, pp. 1-11.
“Headphone Jack (3.5mm),” Retrieved from the Internet URL: http://www.phonescoop.com/glossary/term.php?gid=440, on May 5, 2011, pp. 1-1.
“2.5mm Headset Jack,” Retrieved from the Internet URL: http://www.phonescoop.com/glossary/term.php?gid=360, on May 5, 2011, pp. 1-1.
“Reference Designations for Electrical and Electronics Parts and Equipment,” Engineering Drawing and Related Documentation Practices, ASME Y14.44-2008, The American Society of Mechanical Engineers, Nov. 21, 2008, pp. 1-31.
Acidus, “Mag-stripe Interfacing—A Lost Art,” Retrieved from the Internet URL: http://www.scribd.com/doc/18236182/Magstripe-Interfacing#open—. . . , on Feb. 7, 2011, pp. 1-4.
“Mag-stripe readers The hunt for a homebrew mag-stripe reader that'll work with modern,” Jan. 16, 2009, Retrieved from the Internet URL: http://www.hak5.org/forums/index.php?showtopic=11563&st=20, on Apr. 25, 2011, pp. 1-6.
Kuo, Y-S et al., “Hijacking Power and Bandwidth from the Mobile Phone's Audio Interface,” Proceedings of the First ACM Symposium on Computing for Development, (DEV'10), Dec. 17, 2010, pp. 1-10.
Website: www.alexwinston.com, Aug. 31, 2009, pp. 1-5.
“Magnetic Card Reader,” lekernel.net˜scrapbook, Retrieved from the Internet URL: http://lekernel.net/scrapbook/old/cardreader.html, on Apr. 25, 2011, pp. 1-4.
“Magnetic stripe reader/writer,” Retrieved from the Internet URL: http://www.gae.ucm.es/-padilla/extrawork/stripe.html, on Dec. 21, 2009, pp. 1-2.
Lucks, S., “Two-Pass Authenticated Encryption Faster than Generic Composition,” H. Gilbert and H. Handschuh (Eds.): FSE 2005, LNCS 3557, © International Association for Cryptologic Research 2005, pp. 284-298.
Bauer, G.R. et al., “Comparing Block Cipher Modes of Operation on MICAz Sensor Nodes,” 17th Euromicro International Conference on Parallel, Distributed and Network-based Processing, 2009, Feb. 18-20, 2009, pp. 371-378.
European Search Report and Opinion for European Patent Application No. 11 786 731.7, mailed Mar. 28, 2014.
Office Action for European Patent Application No. 11 786 731.7, mailed Jul. 16, 2015.
Non-Final Office Action mailed Sep. 30, 2011, for U.S. Appl. No. 12/903,753, of McKelvey, J., et al., filed Oct. 13, 2010.
Final Office Action mailed Jul. 6, 2012, for U.S. Appl. No. 12/903,753, of McKelvey, J., et al., filed Oct. 13, 2010.
Non-Final Office Action mailed Jul. 8, 2013, for U.S. Appl. No. 12/903,753, of McKelvey, J., et al., filed Oct. 13, 2010.
Notice of Allowance mailed Oct. 10, 2013, for U.S. Appl. No. 12/903,753, of McKelvey, J., filed Oct. 13, 2010.
Final Office Action mailed Apr. 24, 2013, for U.S. Appl. No. 12/903,758, of Wilson, M., et al., filed Oct. 13, 2010.
Notice of Allowance mailed Aug. 6, 2013, for U.S. Appl. No. 12/903,758, of Wilson, M., et al., filed Oct. 13, 2010.
Notice of Allowance mailed Apr. 4, 2014, for U.S. Appl. No. 14/052,009, of Wilson, M., et al., filed Oct. 11, 2013.
Notice of Allowance mailed Jul. 30, 2014, for U.S. Appl. No. 14/052,009, of Wilson, M., et al., filed Oct. 11, 2013.
Non-Final Office Action mailed Sep. 30, 2011, for U.S. Appl. No. 13/010,976, of Babu, A. R., et al., filed Jan. 21, 2011.
Final Office Action mailed Jun. 12, 2012, for U.S. Appl. No. 13/010,976, of Babu, A. R., et al., filed Jan. 21, 2011.
Advisory Action mailed Sep. 11, 2015, for U.S. Appl. No. 13/298,506, of Lamba, K., et al., filed Nov. 17, 2011.
Notice of Allowance mailed Sep. 16, 2015, for U.S. Appl. No. 14/551,681, of Wade, J., et al., filed Nov. 24, 2014.
Notice of Allowance mailed Oct. 5, 2015, for U.S. Appl. No. 14/322,815, of Edwards, T., filed Jul. 2, 2014.
Notice of Allowance mailed Oct. 6, 2015, for U.S. Appl. No. 13/298,491, of Lamba, K., et al., filed Nov. 17, 2011.
Notice of Allowance mailed Oct. 7, 2015, for U.S. Appl. No. 13/298,510, of Lamba, K., et al., filed Nov. 17, 2011.
Notice of Allowance mailed Oct. 8, 2015, for U.S. Appl. No. 13/298,487, of Lamba, K., et al., filed Nov. 17, 2011.
Non-Final Office Action mailed Oct. 29, 2015, for U.S. Appl. No. 14/512,104, of Templeton, T., et al., filed Oct. 10, 2014.
Related Publications (1)
Number Date Country
20150161419 A1 Jun 2015 US
Provisional Applications (1)
Number Date Country
61914731 Dec 2013 US
Continuations (1)
Number Date Country
Parent 14231598 Mar 2014 US
Child 14578107 US