Mobile telephones are increasingly being used for functions beyond making telephone calls. For example, so-called “smart phones” are commonly used to access Internet applications (e.g., web services) to enable users to conduct transactions, such as purchasing or selling goods or services, or to participate in social networking. Smart phones equipped with short-range radio technology, such as radio frequency identification (“RFID”), near field communication (“NFC”), WiFi Direct, and/or Bluetooth, are increasingly being used to conduct transactions with other similarly equipped smart phones. However, many smart phones may not be equipped with short-range radio technology. For example, the technology may be expensive, or it may be resource-intensive such that it drains a battery. Additionally, smart phones that are only able to communicate using short-range radio technology may not be able to facilitate transactions that require participation from other entities, such as bank or credit card transactions.
Embodiments will be readily understood by the following detailed description in conjunction with the accompanying drawings. To facilitate this description, like reference numerals designate like structural elements. Embodiments are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings.
In the following detailed description, reference is made to the accompanying drawings which form a part hereof wherein like numerals designate like parts throughout, and in which is shown by way of illustration embodiments that may be practiced. It is to be understood that other embodiments may be utilized and structural or logical changes may be made without departing from the scope of the present disclosure. Therefore, the following detailed description is not to be taken in a limiting sense, and the scope of embodiments is defined by the appended claims and their equivalents.
Various operations may be described as multiple discrete actions or operations in turn, in a manner that is most helpful in understanding the claimed subject matter. However, the order of description should not be construed as to imply that these operations are necessarily order dependent. In particular, these operations may not be performed in the order of presentation. Operations described may be performed in a different order than the described embodiment. Various additional operations may be performed and/or described operations may be omitted in additional embodiments.
For the purposes of the present disclosure, the phrase “A and/or B” means (A), (B), or (A and B). For the purposes of the present disclosure, the phrase “A, B, and/or C” means (A), (B), (C), (A and B), (A and C), (B and C), or (A, B and C).
The description may use the phrases “in an embodiment,” or “in embodiments,” which may each refer to one or more of the same or different embodiments. Furthermore, the terms “comprising,” “including,” “having,” and the like, as used with respect to embodiments of the present disclosure, are synonymous.
As used herein, the term “module” may refer to, be part of, or include an Application Specific Integrated Circuit (“ASIC”), an electronic circuit, a processor (shared, dedicated, or group) and/or memory (shared, dedicated, or group) that execute one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that provide the described functionality.
Referring now to
As used herein, an “executable procedure” may include any number of states, transitions between states, actions, or other components that may collectively form a state machine. Non-limiting examples of executable procedures may include selectively conducting transactions with other computing devices, establishing relationships between computing devices and/or users thereof, buying or selling goods or services, and so forth.
For example, in
In various embodiments, a back end server 120 may be provided to facilitate transactions between computing devices such as first computing device 102 and/or second computing device 108. In various embodiments, back end server 120 may facilitate various aspects of a transaction. For example, in various embodiments, back end server 120 may facilitate authentication of a user identity, e.g., to enable withdrawal of funds from a bank account associated with the user. Back end server 120 may additionally or alternatively facilitate other security aspects of a transaction, such as ensuring that transmitted data is kept private, e.g., using cryptography, inviting devices to join the transaction, and so forth.
While back end server 120 is depicted as a single computing device in
In various embodiments, as part of determining its context, first computing device 102 may be configured to determine that first computing device 102 and/or another computing device is/are suitably located to engage in a transaction. For example, in
In various embodiments, upon determining that first computing device 102 and/or second computing device 108 are suitably located, first computing device 102 may be configured to operate a particular executable procedure of a plurality of executable procedures. For example, first computing device 102 may selectively facilitate a transaction, e.g., through back end server 120, with second computing device 108. The selective facilitation of the transaction may be based on various information collected and/or provided by first computing device 102, second computing device 108 and/or back end server 120. For example, in various embodiments, the transaction may be selectively conducted based on a context of first computing device 102, first user 104, second computing device 108, and/or second user 110. Additionally, the transaction may be selectively conducted based on the detected generic gesture (e.g., a wave) made using first computing device 102 and/or second computing device 108, as well as a connotation that is associated with the detected generic gesture based on the determined context of first computing device 102, first user 104, second computing device 108, and/or second user 110.
In various embodiments, a generic gesture may have multiple connotations (hence the name, “generic”), depending on a context of first computing device 102, and/or first user 104. For example, if first user 104 waves first computing device 102 in a first location, that wave may have a different connotation than if first user 104 waves first computing device 102 in a second location.
In various embodiments, first computing device 102 may include a plurality of executable procedures that it may selectively operate in response to a detected generic gesture and a determined context of first computing device 102 and/or first user 104. For example, one executable procedure may be operated by first computing device 102 if waved by first user 104 in a coffee shop (e.g., first user 104 may be authorizing the coffee shop to deduct funds from the user's bank account in exchange for coffee). Another executable procedure may be operated by first computing device 102 if waved by first user 104 at a rideshare parking lot (e.g., first user 104 may be attempting to form a rideshare relationship with second user 110).
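The context-dependent selection described above can be sketched as a simple dispatch table. The procedure names, context keys, and venue types below are illustrative assumptions only, not part of this disclosure:

```python
# Hypothetical sketch: selecting an executable procedure based on a
# detected generic gesture and a determined context. The same generic
# gesture ("wave") maps to different procedures in different contexts.

def pay_merchant(context):
    # e.g., authorize a coffee-shop purchase
    return f"authorize payment at {context['venue']}"

def form_rideshare(context):
    # e.g., propose a rideshare relationship
    return f"propose rideshare within {context['geofence']}"

# Dispatch table keyed by (gesture, venue type).
PROCEDURES = {
    ("wave", "coffee_shop"): pay_merchant,
    ("wave", "rideshare_lot"): form_rideshare,
}

def select_procedure(gesture, context):
    """Return the executable procedure for this gesture/context, if any."""
    return PROCEDURES.get((gesture, context["venue_type"]))

proc = select_procedure("wave", {"venue_type": "rideshare_lot"})
print(proc({"geofence": "lot-7"}))  # → propose rideshare within lot-7
```

In this sketch, reconfiguring the dispatch table (e.g., per user preferences) changes which procedure a given gesture triggers without changing the gesture itself.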
In various embodiments where the executable procedure selectively operated by first computing device 102 is to selectively conduct a transaction with second computing device 108, a first line of network communication 124 (direct or indirect) may be established between first computing device 102 and back end server 120. Likewise, in various embodiments, a second line of network communication 126 (direct or indirect) may be established between second computing device 108 and back end server 120. Using these lines of network communication, first computing device 102 and second computing device 108 may engage in a variety of transactions, including but not limited to exchange of goods or services, alteration of a relationship between first user 104 and second user 110, and so forth.
Relationships between users may be identified in various ways. In various embodiments, relationships may be identified from a “social graph,” such as from friends and/or acquaintances identified in a social network. Users connected via a social graph may be connected to each other's identities, e.g., because they know one another. Additionally or alternatively, relationships between users may be identified from an “interest graph.” An interest graph may be a network of users who share one or more interests or affiliations, but who do not necessarily know each other personally. A non-limiting example of an interest graph may be a rideshare network of users who are willing and/or capable of participating in ride sharing, e.g., to address traffic congestion and/or save on fuel costs.
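The distinction between the two graph types can be illustrated with a minimal membership check. The graph structures and user names below are illustrative assumptions only:

```python
# Hypothetical sketch: a social graph records who knows whom, while an
# interest graph records shared affiliations among users who may not
# know each other personally.

social_graph = {
    "alice": {"bob", "carol"},  # users alice knows personally
    "bob": {"alice"},
}

interest_graph = {
    # members of a rideshare network who may be strangers to each other
    "rideshare_network": {"alice", "dave"},
}

def socially_connected(a, b):
    """True if user a knows user b via the social graph."""
    return b in social_graph.get(a, set())

def share_interest(a, b, network):
    """True if users a and b both belong to the named interest graph."""
    members = interest_graph.get(network, set())
    return a in members and b in members

print(socially_connected("alice", "dave"))                   # → False
print(share_interest("alice", "dave", "rideshare_network"))  # → True
```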
In various embodiments, first computing device 102 may be configured to determine and/or obtain contextual information about first computing device 102 and/or first user 104. In various embodiments, computing device 102 may determine and/or obtain contextual information from “soft” data sources 232 and/or “hard” data sources 234. Selective operation logic 230 may be configured to selectively operate one or more of plurality of executable procedures 231 based on contextual information obtained from soft data sources 232 and/or hard data sources 234 (including a detected gesture).
In various embodiments, soft data sources 232 may include any resource, typically but not necessarily a network resource, that includes contextual information about first user 104. For example, in
In various embodiments, hard data sources 234 may include any system resource of computing device 102 that provides contextual information about first computing device 102 and/or data related to a detected gesture. For example, in
In various embodiments, first computing device 102 may include a library 264 of predefined generic gestures. In various embodiments, first computing device 102 may be configured to match a gesture detected using, e.g., accelerometer 254 and/or gyroscope 258, to a generic gesture of the library of generic gestures 264. Based on this determined gesture and/or an associated connotation, selective operation logic 230 may selectively operate an executable procedure of plurality of executable procedures 231.
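Matching a detected motion trace against a library of predefined gestures might be sketched as a nearest-neighbor comparison over accelerometer samples. The templates, threshold, and distance metric below are illustrative assumptions only:

```python
# Hypothetical sketch: matching an accelerometer trace to a library of
# predefined generic gestures via nearest-neighbor distance.

import math

# Each template is a short sequence of (x, y, z) acceleration samples.
GESTURE_LIBRARY = {
    "wave": [(0.0, 1.0, 0.0), (0.0, -1.0, 0.0), (0.0, 1.0, 0.0)],
    "shake": [(1.0, 0.0, 0.0), (-1.0, 0.0, 0.0), (1.0, 0.0, 0.0)],
}

def _distance(trace_a, trace_b):
    """Sum of per-sample Euclidean distances (assumes equal lengths)."""
    return sum(math.dist(a, b) for a, b in zip(trace_a, trace_b))

def match_gesture(trace, threshold=1.5):
    """Return the library gesture nearest to the trace, or None if no
    template is within the threshold."""
    best = min(GESTURE_LIBRARY, key=lambda g: _distance(trace, GESTURE_LIBRARY[g]))
    return best if _distance(trace, GESTURE_LIBRARY[best]) <= threshold else None

# A noisy trace resembling the "wave" template:
sample = [(0.1, 0.9, 0.0), (0.0, -1.1, 0.0), (0.0, 1.0, 0.1)]
print(match_gesture(sample))  # → wave
```

A production gesture matcher would more likely use dynamic time warping or a trained classifier; the fixed-length comparison here is only to make the library-lookup idea concrete.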
Various generic gestures having potentially multiple context-dependent connotations may be included in library 264. For example, in
In some embodiments, first user 104 may define a signature gesture that may be used in particular contexts to enable authentication of first user 104 and/or provide an additional security layer. First user 104 may then move computing device 102 in such a predefined manner, e.g., to authenticate the user's identity, much as first user 104 may use a password for authentication.
As noted above, a context of first computing device 102 may include user preferences 244. For example, first user 104 may deliberately configure user preferences 244 of first computing device 102 so that a particular gesture causes a first executable procedure to be operated. Later, first user 104 may reconfigure user preferences 244 of first computing device 102 so that the same gesture will now cause a second executable procedure to occur.
A context may additionally or alternatively include one or more states of first computing device 102. For example, if first computing device 102 is in a first state (e.g., has a map program open) and detects a generic gesture, it may operate a particular executable procedure (e.g., reorient the map, change map viewpoint, etc.). However, if first computing device 102 is in a second state (e.g., map program not open), detection of the same generic gesture may cause a different executable procedure (e.g., unrelated to the map program). Other non-limiting examples of states that may cause first computing device 102 to selectively operate different executable procedures on detection of a generic gesture include, but are not limited to, battery power, temperature (of first computing device 102 or its environment), computing load of first computing device 102, wireless signal strength, channel condition, and so forth.
In some embodiments, first computing device 102 may be configured to track a shape or path in the air created by first user 104 by moving first computing device 102. For example, first user 104 could, with first computing device 102 in hand, mimic drawing of a particular letter or sequence of letters (e.g., to form a word such as a password) in the air. First computing device 102 may detect this movement, e.g., using accelerometer 254 and/or gyroscope 258, and may match it to a corresponding movement and associated connotation (e.g., a letter).
As an example, suppose first user 104 is a member of a rideshare social network and seeks a ride to a particular location. First user 104 may enter an area defined by a geofence, such as a rideshare parking lot commonly used by members of the rideshare social network, to search for another member going to the same or a similar location, or in the same direction. Second user 110 may also be a member of the rideshare social network. Second user 110 may drive into the rideshare parking lot seeking other members to join second user 110 (e.g., to split fuel costs).
When first user 104 sees second user 110 pull in, first user 104 may move first computing device 102 to form a gesture. Contemporaneously with detecting the gesture, first computing device 102 may determine a context of first computing device 102, second computing device 108, first user 104, and/or second user 110. For example, first computing device 102 may determine that it is located within geofence 106 and/or that second computing device 108 is also located within geofence 106. First computing device 102 may also consult with a soft data source 232, such as social graph 240 or interest graph 242, to confirm that the user associated with second computing device 108, i.e., second user 110, is a member of the rideshare social network. Once the context is determined, first computing device 102 may selectively operate an executable procedure of a plurality of executable procedures that includes authorization and/or authentication of a transaction between first computing device 102 and second computing device 108. For example, first computing device 102 may establish a ridesharing agreement between first user 104 and second user 110.
Selective conduction of a transaction between computing devices may not always be based purely on proximity of the computing devices to each other. In some embodiments, selective conduction of a transaction between computing devices may be based on an absolute location of one or more of the computing devices. For example, in some embodiments, first computing device 102 may be configured to determine its absolute location, e.g., using GPS unit 260. Based on this location, first computing device 102 may be configured to selectively conduct a transaction with another computing device. An example of this is shown in
In
In some embodiments, first computing device 102 may be configured to determine a type of venue 370, e.g., based on its location. For example, using GPS coordinates, first computing device 102 may determine that venue 370 is a particular coffee house, or one of a chain of coffee houses. Based at least in part on this determined context, and on a gesture first user 104 makes using first computing device 102, first computing device 102 may selectively operate an executable procedure of a plurality of executable procedures 231.
For instance, in
For instance, back end server 320 may track “rewards” that customers have accumulated, e.g., via repeated purchase of products. First computing device 102 may determine a context of first user 104, including that first user is a member of the rewards program, and/or that first user 104 has accumulated sufficient rewards to earn a prize (e.g., based on transaction history 236). First computing device 102 may transmit this contextual information to back end server 320. Back end server 320 may instruct second computing device 308 to provide the prize to first user 104, e.g., by dispensing the prize or authorizing its release.
At operation 402, first computing device 102 may detect a gesture made by first user 104 using first computing device 102. For example, first computing device 102 may obtain data from accelerometer 254 and/or gyroscope 258. At operation 404, first computing device 102 may match the gesture detected at operation 402 to a generic gesture from library 264.
At operation 406, a context of first computing device 102 and/or first user 104 may be determined. For example, at operation 408, a location of first computing device 102 may be determined, e.g., using GPS 260. As another example, at operation 410, first computing device 102 may determine, e.g., using GPS 260 or other components, whether it is within a geofence. As noted above, such a geofence may be defined by first computing device 102 itself or by another computing device. At operation 412, first computing device 102 may determine whether a remote computing device, such as second computing device 108, is also within the same geofence. Myriad other contextual information may be determined at operation 406. For example, first computing device 102 may determine whether a remote computing device is within a particular proximity (e.g., within Bluetooth or NFC range), connected to a particular wireless access point (e.g., a WiFi access point at a particular venue), whether first user 104 is a member of a particular rewards program, whether first user 104 has a social graph 240 relationship with a user associated with another computing device, and so forth.
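A geofence containment check such as the one at operations 410-412 might, for a circular fence, reduce to a great-circle distance comparison. The coordinates and radius below are illustrative assumptions only:

```python
# Hypothetical sketch: a circular-geofence containment check using the
# haversine great-circle distance between a device's GPS fix and the
# fence center.

import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def within_geofence(device_lat, device_lon, fence_lat, fence_lon, radius_m):
    """True if the device's fix lies inside the circular fence."""
    return haversine_m(device_lat, device_lon, fence_lat, fence_lon) <= radius_m

# A ~100 m fence around a hypothetical rideshare lot:
print(within_geofence(45.5231, -122.6765, 45.5234, -122.6762, 100.0))  # → True
```

The same check, applied to a location reported by a remote device, covers the operation-412 case of testing whether both devices occupy the same geofence.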
At operation 414, the gesture detected and matched at operations 402-404 may be associated with a connotation, based at least in part on the context determined at operation 406. For example, if first computing device 102 detects that first user 104 is waving first computing device 102 within a predetermined proximity (e.g., in the same geofence) of second computing device 108, then the connotation may be that first user 104 authorizes a transaction between first computing device 102 and second computing device 108, or that a relationship (e.g., in a social graph 240 and/or interest graph 242) should be formed between first user 104 and a user associated with the other computing device.
At operation 416, first computing device 102 may selectively operate one or more of plurality of executable procedures 231 based on the gesture detected at operation 402 and the context determined at operation 406. For example, at operation 418, first computing device 102 may selectively conduct a transaction with a remote computing device such as second computing device 108. Myriad other executable procedures may be selectively operated at operation 416.
For example, in some embodiments, first computing device 102 may, based on the detected gesture, its location and/or a context of first computing device 102 or first user 104, disclose (e.g., broadcast) its availability to enter into a transaction, e.g., a ridesharing agreement. Suppose first user 104 is a member of a rideshare club and carries first computing device 102 at or near a predefined rideshare meeting place. First computing device 102 may broadcast its context and/or availability to enter into a rideshare agreement. In such embodiments, first user 104 may then initiate or confirm willingness to enter into a transaction with another computing device in the area by making a gesture with first computing device 102, and/or by watching for a gesture made by another user using another computing device.
First computing device 102 may detect a gesture made using another computing device, such as second computing device 108, in various ways. For instance, first computing device 102 may cause camera 262 to capture one or more digital images of the second computing device 108. Using various techniques, such as image processing, at operation 420, first computing device 102 may determine whether a gesture made using second computing device 108, captured in the one or more captured digital images, matches a gesture from library 264. In other embodiments, first computing device 102 may detect a gesture made using second computing device 108 in other ways. For example, first computing device 102 may receive other data indicative of a gesture made using second computing device 108, such as a sequence of movements made using second computing device 108, and may match that data to a gesture in library 264.
A variety of executable procedures aside from the examples already described are possible using disclosed techniques. Moreover, various information determined at various operations may be used to facilitate all or portions of an executable procedure. For example, referring again to
In various embodiments, after a transaction between first computing device 102 and second computing device 108 is completed, first computing device 102, second computing device 108 and/or a back end server may establish a context for future transactions between first computing device 102 and second computing device 108. For example, first computing device 102 may store identification and/or authentication information associated with second computing device 108, and vice versa. That way, first computing device 102 may be able to connect more easily to second computing device 108 in the future, e.g., to engage in transactions of the same or similar type. For example, first computing device 102 and second computing device 108 may be configured for “pairing.” That way, when they are later suitably located (e.g., both within a geofence), they may, e.g., automatically or in response to at least some user intervention, establish lines of communication with each other to facilitate transactions. Method 400 may then end.
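The "pairing" idea above amounts to caching a peer's identification and authentication details after a completed transaction. The data structures and identifiers below are illustrative assumptions only:

```python
# Hypothetical sketch: remembering a transaction peer so that future
# connections between the same devices can be established more easily.

pairing_store = {}

def record_pairing(local_id, peer_id, auth_token):
    """After a completed transaction, remember the peer's
    identification/authentication information for future sessions."""
    pairing_store.setdefault(local_id, {})[peer_id] = auth_token

def is_paired(local_id, peer_id):
    """True if a prior transaction left stored context for this peer."""
    return peer_id in pairing_store.get(local_id, {})

record_pairing("device-102", "device-108", "session-key-xyz")
print(is_paired("device-102", "device-108"))  # → True
```

On a later encounter (e.g., both devices again inside a geofence), a positive `is_paired` lookup could let the devices skip parts of the initial authentication exchange.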
At operation 504, the back end server may selectively operate an executable procedure of a plurality of executable procedures. For example, in various embodiments, the back end server may cross check a generic gesture received from first computing device 102 against a library of predefined generic gestures. Then, based on a context of first computing device 102 and/or second computing device 108, the back end server may determine what type of transaction is desired by first computing device 102, perform authentication of first computing device 102, and so forth.
In various embodiments, at operation 506, the back end server may generate and/or transmit, e.g., to second computing device 108, an indication that first computing device 102 desires to enter a particular transaction (e.g., determined based on the detected gesture and context of first computing device 102/first user 104). During this operation or at another time, the back end server may also send other information necessary to enter the transaction to second computing device 108.
At operation 508, the back end server may receive, e.g., from first computing device 102 and/or second computing device 108, information to enable second computing device 108 to enter into the transaction. For example, second user 110 may move second computing device 108 in a gesture (e.g., which may be detected by second computing device 108 or observed by first computing device 102) to indicate that second user 110 is ready to conduct a transaction with first computing device 102. In some embodiments, second computing device 108 may additionally or alternatively provide a context of second computing device 108, security information (e.g., credentials of second user 110), and so forth.
At operation 510, based on various information received from first computing device 102 and/or information received from second computing device 108, the back end server may selectively facilitate the transaction. For example, if credentials received from either first computing device 102 or second computing device 108 are invalid, or if either computing device indicates that it is unable to enter into the transaction, then the back end server may deny the transaction. But if information received from both parties indicates a readiness to enter the transaction from both sides, then the back end server may facilitate the transaction. In various embodiments, at operation 512, the back end server may establish (e.g., store) a context for future transactions of the same type or different types between first computing device 102 and second computing device 108.
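The gating logic at operation 510 (facilitate only when both parties are ready and validly credentialed) might be sketched as follows. The credential table, field names, and return values are illustrative assumptions only:

```python
# Hypothetical sketch: a back end server approving a transaction only
# when both parties present valid credentials and indicate readiness.

# Illustrative credential table the back end is assumed to hold.
VALID_CREDENTIALS = {"device-102": "token-a", "device-108": "token-b"}

def credentials_valid(party):
    """Check a party's credential against the server's records."""
    return VALID_CREDENTIALS.get(party["device_id"]) == party["credential"]

def facilitate_transaction(first_party, second_party):
    """Deny if either party is not ready or presents bad credentials."""
    for party in (first_party, second_party):
        if not party.get("ready") or not credentials_valid(party):
            return "denied"
    return "facilitated"

p1 = {"device_id": "device-102", "credential": "token-a", "ready": True}
p2 = {"device_id": "device-108", "credential": "token-b", "ready": True}
print(facilitate_transaction(p1, p2))  # → facilitated
```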
System control logic 608 for one embodiment may include any suitable interface controllers to provide for any suitable interface to at least one of the processor(s) 604 and/or to any suitable device or component in communication with system control logic 608.
System control logic 608 for one embodiment may include one or more memory controller(s) to provide an interface to system memory 612. System memory 612 may be used to load and store data and/or instructions, for example, for computing device 600. In one embodiment, system memory 612 may include any suitable volatile memory, such as suitable dynamic random access memory (“DRAM”), for example.
System control logic 608, in one embodiment, may include one or more input/output (“I/O”) controller(s) to provide an interface to NVM/storage 616 and communications interface(s) 620.
NVM/storage 616 may be used to store data and/or instructions, for example. NVM/storage 616 may include any suitable non-volatile memory, such as flash memory, for example, and/or may include any suitable non-volatile storage device(s), such as one or more hard disk drive(s) (“HDD(s)”), one or more solid-state drive(s), one or more compact disc (“CD”) drive(s), and/or one or more digital versatile disc (“DVD”) drive(s), for example.
The NVM/storage 616 may include a storage resource physically part of a device on which the computing device 600 is installed or it may be accessible by, but not necessarily a part of, the device. For example, the NVM/storage 616 may be accessed over a network via the communications interface(s) 620.
System memory 612 and NVM/storage 616 may include, in particular, temporal and persistent copies of selective operation logic 230. The selective operation logic 230 may include instructions that when executed by at least one of the processor(s) 604 result in the computing device 600 practicing one or more of the operations described above for method 400 and/or 500. In some embodiments, the selective operation logic 230 may additionally/alternatively be located in the system control logic 608.
Communications interface(s) 620 may provide an interface for computing device 600 to communicate over one or more network(s) and/or with any other suitable device. Communications interface(s) 620 may include any suitable hardware and/or firmware, such as a network adapter, one or more antennas, a wireless interface, and so forth. In various embodiments, communication interface(s) 620 may include an interface for computing device 600 to use NFC, WiFi Direct, optical communications (e.g., barcodes), Bluetooth or other similar technologies to communicate directly (e.g., without an intermediary) with another device.
For one embodiment, at least one of the processor(s) 604 may be packaged together with system control logic 608 and/or selective operation logic 230 (in whole or in part). For one embodiment, at least one of the processor(s) 604 may be packaged together with system control logic 608 and/or selective operation logic 230 (in whole or in part) to form a System in Package (“SiP”). For one embodiment, at least one of the processor(s) 604 may be integrated on the same die with system control logic 608 and/or selective operation logic 230 (in whole or in part). For one embodiment, at least one of the processor(s) 604 may be integrated on the same die with system control logic 608 and/or selective operation logic 230 (in whole or in part) to form a System on Chip (“SoC”).
In various implementations, computing device 600 may be a laptop, a netbook, a notebook, an ultrabook, a smart phone, a computing tablet, a personal digital assistant (“PDA”), an ultra mobile PC, a mobile phone, a desktop computer, a server, a printer, a scanner, a monitor, a set-top box, an entertainment control unit (e.g., a gaming console), a digital camera, a portable music player, or a digital video recorder. In further implementations, the computing device 600 may be any other electronic device that processes data.
Computer-readable media (including non-transitory computer-readable media), methods, systems and devices for performing the above-described techniques are illustrative examples of embodiments disclosed herein. Additionally, other devices in the above-described interactions may be configured to perform various disclosed techniques.
Although certain embodiments have been illustrated and described herein for purposes of description, a wide variety of alternate and/or equivalent embodiments or implementations calculated to achieve the same purposes may be substituted for the embodiments shown and described without departing from the scope of the present disclosure. This application is intended to cover any adaptations or variations of the embodiments discussed herein. Therefore, it is manifestly intended that embodiments described herein be limited only by the claims.
Where the disclosure recites “a” or “a first” element or the equivalent thereof, such disclosure includes one or more such elements, neither requiring nor excluding two or more such elements. Further, ordinal indicators (e.g., first, second or third) for identified elements are used to distinguish between the elements, and do not indicate or imply a required or limited number of such elements, nor do they indicate a particular position or order of such elements unless otherwise specifically stated.