Various embodiments relate generally to electrical and electronic hardware, computer software, human-computing interfaces, wired and wireless network communications, telecommunications, data processing, wearable devices, and computing devices. More specifically, disclosed are techniques for positioning media devices or other devices using motion data.
Audio effects, such as surround sound and two-dimensional (2D) and three-dimensional (3D) spatial audio, are becoming increasingly popular. To provide audio effects, different audio channels may be presented at different loudspeakers as a function of the locations or orientations of the loudspeakers. The loudspeakers may be coupled to one or more media devices or speaker boxes. Conventionally, media devices are manually positioned such that an appropriate audio channel may be presented at each media device to provide a desired audio effect. For example, a first audio channel may be configured to be presented from a front right position, a second audio channel may be configured to be presented from a front left position, and a third audio channel may be configured to be presented from a back center position. A user may manually position one media device to be in the front right, one to be in the front left, and one to be in the back center. As another example, a user may position a plurality of media devices in a room, and may manually enter the position of each media device using a user interface. An audio channel may be provided to each media device based on the positions entered by the user.
Thus, what is needed is a solution for positioning devices without the limitations of conventional techniques.
Various embodiments or examples (“examples”) are disclosed in the following detailed description and the accompanying drawings:
Various embodiments or examples may be implemented in numerous ways, including as a system, a process, an apparatus, a user interface, or a series of program instructions on a computer readable medium such as a computer readable storage medium or a computer network where the program instructions are sent over optical, electronic, or wireless communication links. In general, operations of disclosed processes may be performed in an arbitrary order, unless otherwise provided in the claims.
A detailed description of one or more examples is provided below along with accompanying figures. The detailed description is provided in connection with such examples, but is not limited to any particular example. The scope is limited only by the claims, and numerous alternatives, modifications, and equivalents are encompassed. Numerous specific details are set forth in the following description in order to provide a thorough understanding. These details are provided for the purpose of example and the described techniques may be practiced according to the claims without some or all of these specific details. For clarity, technical material that is known in the technical fields related to the examples has not been described in detail to avoid unnecessarily obscuring the description.
Position manager 110 may be configured to determine a position of device 101 relative to device 102 based on motion data 124, which may represent displacements d1 and d2. The position may include a relative distance, direction, orientation, and the like. For example, device 101 may be moved horizontally to the left with a displacement d1 and device 102 may be moved horizontally to the right with a displacement d2. Then a relative position of device 101 may be a distance of d1+d2 to the left of device 102. The position may include information with respect to a two-dimensional (2D) or three-dimensional (3D) space. For example, position manager 110 may determine the position of device 101 in x and y coordinates, or x, y, and z coordinates, or polar coordinates, or others. Position manager 110 may determine the position of device 101 in terms of a displacement magnitude and a displacement direction. The magnitude may be provided as a distance, and the displacement direction may be provided as an angle. Position manager 110 may determine an orientation of device 101 relative to device 102. The orientation may be provided in terms of one or more angles from a reference. For example, devices 101 and 102 may have a number of sides, such as front, back, left, right, top, and bottom sides. Different interfaces, speakers, buttons, and the like, may be placed on each side. For example, if each respective side of device 101 is facing substantially the same direction as each respective side of device 102 (e.g., the front of device 101 is facing the same direction as the front of device 102, the top of device 101 is facing the same direction as the top of device 102, the right of device 101 is facing the same direction as the right of device 102, etc.), then devices 101-102 may have the same orientation, and a relative rotation angle of device 101 relative to device 102 may be zero (0) degrees.
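By way of illustration only, the relative-position computation described above might be sketched as follows, assuming each device reports a 2D displacement vector measured from a shared starting point (e.g., where the devices were bumped together); the function name, coordinate conventions, and sample values are illustrative assumptions rather than part of the described subject matter:

```python
import math

def relative_position(d1, d2):
    """Relative position of device 101 with respect to device 102.

    d1, d2: (x, y) displacement vectors of each device, measured from
    the shared starting point (e.g., where the devices were bumped).
    Returns (distance, direction_degrees) from device 102 to device 101.
    """
    rx, ry = d1[0] - d2[0], d1[1] - d2[1]
    distance = math.hypot(rx, ry)
    direction = math.degrees(math.atan2(ry, rx))
    return distance, direction

# Device 101 moved 0.5 m left, device 102 moved 0.3 m right:
print(relative_position((-0.5, 0.0), (0.3, 0.0)))  # (0.8, 180.0), i.e., 0.8 m to the left
```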
Position manager 110 may be implemented in device 101, device 102, a server, or another device, or be distributed over any combination of the above devices. Position manager 110 may perform or initiate one or more operations on device 101, device 102, and/or other devices (not shown) as a function of the position of device 101. Position manager 110 may modify functionalities of devices 101-102, access or transmit data to and from devices 101-102, select audio channels or other content to be presented by devices 101-102, terminate communications or transmission of data between devices 101-102, or perform other functions based on the position of device 101. In some examples, devices 101-102 may be configured to provide an audio effect such as surround sound, 2D or 3D spatial audio, and the like. Surround sound is a technique that may be used to enrich the sound experience of a user by presenting multiple audio channels from multiple speakers. 2D or 3D spatial audio may be a sound effect produced by the use of multiple speakers to virtually place sound sources in 2D or 3D space, including behind, above, or below the user, independent of the real placement of the multiple speakers. In some examples, at least two transducers operating as loudspeakers can generate acoustic signals that can form an impression or a perception at a listener's ears that sounds are coming from audio sources disposed anywhere in a space (e.g., 2D or 3D space) rather than just from the positions of the loudspeakers. In presenting spatial audio effects, different audio channels may be mapped to different speakers. One or more loudspeakers may be coupled to each of device 101 and device 102. Each loudspeaker may be configured to present or play an audio channel to provide an audio effect. Position manager 110 may determine an audio signal to be generated at a speaker coupled to device 101 as a function of the position of device 101 relative to device 102. Position manager 110 may cause presentation of the audio signal at the speaker coupled to device 101. Still, other functionalities or operations may be performed.
For example, as shown, each device 103-104 may have two speakers on the front side, one on the left and another on the right. In some examples, devices 103-104 may be configured to provide an audio effect for a “sweet spot,” or an optimal location, which is normal to a front side 105 or 106. Where devices 103-104 are used together, the sweet spot or region may be an area near an intersection 140 of the two normals 141-142. In one example, speakers 107a and 107b may present a left channel and speakers 108a and 108b may present a right channel, which may present an audio effect. As described above, the audio effect may include surround sound, 2D or 3D audio, or others. The audio effect may be best heard or experienced at sweet spot 140. In one example, each speaker 107a-b, 108a-b may present different channels, which may together provide an audio effect to be heard at sweet spot 140. As the position of device 103 relative to device 104 changes, the location of sweet spot 140 also changes. For example, if device 103 moves in the −x direction and distance d is increased, then sweet spot 140 may move in the −y direction. As another example, as angle of orientation a changes, a location of sweet spot 140 may change. A position manager may determine a position of device 103 relative to device 104 based on motion data. A location of sweet spot 140 may be determined based on the relative position of device 103. In some examples, devices 103-104 may be moved or rotated with respect to the z axis. Additional angles of direction and angles of orientation may be determined by a position manager. Further, in some examples, devices 103-104 may be configured to steer audio signals such that the sweet spot need not necessarily be normal to front sides 105-106. For example, each device 103-104 may have an array or matrix of speakers that may be used to steer audio signals. Based on a relative position of device 103, the array or matrix of speakers may be configured to steer audio signals to a number of sweet spots. Still, other implementations may be used.
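As a non-limiting sketch, the location of a sweet spot such as intersection 140 might be computed as the intersection of the two front-side normals, given each device's position and orientation; the function below and its example values are illustrative assumptions:

```python
import math

def sweet_spot(p1, deg1, p2, deg2):
    """Intersection of the two front-side normals (the 'sweet spot').

    p1, p2: (x, y) positions of the two devices; deg1, deg2: directions
    of their front-side normals in degrees. Returns (x, y) or None if
    the normals are parallel.
    """
    n1 = (math.cos(math.radians(deg1)), math.sin(math.radians(deg1)))
    n2 = (math.cos(math.radians(deg2)), math.sin(math.radians(deg2)))
    # Solve p1 + t1*n1 == p2 + t2*n2 for t1 using a 2x2 determinant.
    det = n1[0] * (-n2[1]) - n1[1] * (-n2[0])
    if abs(det) < 1e-9:
        return None  # normals are parallel; no single intersection
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    t1 = (dx * (-n2[1]) - dy * (-n2[0])) / det
    return p1[0] + t1 * n1[0], p1[1] + t1 * n1[1]

# Two devices 2 m apart, each angled 30 degrees inward toward the listener:
print(sweet_spot((0.0, 0.0), 60.0, (2.0, 0.0), 120.0))  # approx. (1.0, 1.73)
```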
Motion and bump determinator 211 may be configured to determine a motion or change in acceleration to detect an initiation of a process (e.g., via a bump), a termination of a process, and a transit duration. A bump may be associated with a change in acceleration, such as a device contacting, tapping, knocking, hitting, or colliding with another object or surface, such as another device. A user may bump device 201 against device 202. A bump may serve as an initiating motion, or a trigger or command signal to begin a process of determining a position of device 201. Another initiating motion or other command signal may be used, such as another motion or gesture, a button coupled to device 201, an entry into user interface 221, and the like. Further, motion and bump determinator 211 may also be configured to determine a terminating motion indicating an end point to be used in determining a position of device 201. The terminating motion may be associated with a user putting down device 201, a user stopping the movement of device 201, and the like. Other command signals may be used, such as another motion or gesture, a button, user interface 221, and the like, as mentioned above. Motion and bump determinator 211 may store one or more templates or conditions associated with a bump, a terminating motion, or other motions used to indicate start and end points for determining a position of device 201. For example, a condition associated with a bump may include a sudden change in acceleration, in terms of magnitude, direction, and/or other parameters. A condition associated with a bump may include a threshold, and the change in motion data must be greater than the threshold in order for the condition to be met. For example, a condition associated with a terminating motion may include a sudden change in acceleration, an acceleration indicating that device 201 has been moved downwards (e.g., being put down or placed by a user), and the like. Motion and bump determinator 211 may compare motion data from motion sensor 224 to one or more templates or conditions to determine a match. Motion sensor 224 may be one or more sensors, and may include an accelerometer, gyroscope, inertial sensor, or other sensor that may be used to detect a motion or motion vector. A motion sensor may determine a motion vector with more than one component or axis, such as a 2- or 3-axis accelerometer. A match may be found if the motion data matches the template or condition within a tolerance. For example, if a bump is determined, then subsequent motion data from motion sensor 224 may be used to determine a displacement of device 201. If a terminating motion is determined, then preceding motion data may be used to determine the displacement, and subsequent motion data may not be used. For example, a plurality of portions of motion data may be captured by motion sensor 224 (see, e.g.,
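For example, a threshold condition of the kind described above might be expressed as in the following sketch, where the sample format and the default threshold of 3.0 m/s^2 are illustrative assumptions:

```python
def detect_bump(samples, threshold=3.0):
    """Flag a bump when the change in acceleration magnitude between
    consecutive samples exceeds a threshold (a simple template/condition).

    samples: list of (ax, ay, az) accelerometer readings in m/s^2;
    threshold: minimum jump in magnitude to count as a bump. Both the
    sample format and the 3.0 m/s^2 default are illustrative.
    """
    mags = [(ax**2 + ay**2 + az**2) ** 0.5 for ax, ay, az in samples]
    for i in range(1, len(mags)):
        if abs(mags[i] - mags[i - 1]) > threshold:
            return i  # index where the sudden change occurred
    return None
```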
Pairing facility 212 may be configured to pair devices 201-202. After devices 201-202 are bumped together, or another command signal is given, devices 201-202 may be paired. Pairing of devices 201-202 may be performed using acoustic signals, as described in co-pending U.S. patent application Ser. No. 14/266,697, filed Apr. 30, 2014, entitled “Pairing Devices Using Acoustic Signals,” which is incorporated by reference herein in its entirety for all purposes. In another example, device 202 may capture motion data associated with a bump, and determine a parameter associated with the motion data, such as a time of the bump, an acceleration of the bump, and the like. Device 201 may also capture motion data associated with a bump, and determine a parameter associated with the motion data. Device 202 may transmit a data packet including data associated with the bump and data identifying itself. Device 201 may compare the data associated with the bump received from device 202 with the data associated with the bump generated by device 201. If a match or correlation between bump parameters from device 201 and device 202 is found, device 201 may use the data identifying device 202 to pair with device 202. Still, other methods for pairing devices 201-202 may be used. Pairing may include creating secure communications between the devices. Pairing may include creating an ad hoc network or a connection between devices, whereby the devices may transmit and receive data to and from each other. Data may be exchanged using a number of wireless communications protocols, including Bluetooth, maintained by Bluetooth Special Interest Group (SIG) of Kirkland, Wash., ZigBee, maintained by ZigBee Alliance of San Ramon, Calif., Z-Wave, maintained by Z-Wave Alliance, Wireless USB, maintained by USB Implementers Forum, Inc., and the like. Pairing may include generating or storing a shared key or link key between the devices, which may be used to authenticate the connection or trusted relationship between the paired devices. Authentication may include, for example, encrypting and decrypting data, creating encryption keys, and the like. Once paired, devices 201-202 may share data with each other, including data representing audio signals or audio channels, data representing user settings, control signals, and the like. In other examples, pairing facility 212 may not be used. Devices 201-202 may communicate with an intermediary or a server, which may facilitate communication between devices 201-202. Other forms of communications other than wireless radio signals may be used, such as using acoustic signals, and the like.
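A minimal sketch of the bump-parameter comparison described above follows; the record fields, tolerances, and the assumption of a shared clock are illustrative:

```python
def bumps_match(local_bump, remote_bump, max_dt=0.1, max_da=1.0):
    """Decide whether two bump records describe the same physical bump.

    Each record is a dict with 'time' (seconds, shared clock assumed)
    and 'accel' (peak magnitude, m/s^2). Tolerances are illustrative.
    """
    same_time = abs(local_bump["time"] - remote_bump["time"]) <= max_dt
    same_force = abs(local_bump["accel"] - remote_bump["accel"]) <= max_da
    return same_time and same_force

# If the packet from device 202 matches our own bump record, pair with it:
packet = {"device_id": "device-202", "time": 12.03, "accel": 9.4}
mine = {"time": 12.02, "accel": 9.1}
if bumps_match(mine, packet):
    print("pair with", packet["device_id"])
```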
Displacement determinator 213 may be configured to determine a displacement of device 201 using motion data sensed by motion sensor 224. The displacement of device 201 may be determined based on the portion of motion data between a portion of motion data matching a bump template and another portion of motion data matching a terminating motion template. A displacement may be a shortest distance from an initial position to a final position of a point or object. It may be the length of the straight line between the initial and final positions. A displacement may be a vector, including a distance and a direction. It may be the same as or different from the distance or path traveled by the object. Motion data may be used to determine a distance and direction that device 201 is moved, from which a displacement may be determined. For example, device 201 may not be moved in a straight line. Device 201 may be moved around a room, and then placed at a final position by a user. A displacement of device 201 may be the distance between the initial position and the final position, irrespective of the path that was taken by device 201. For example, acceleration data may be sampled at a certain frequency. The acceleration data may be processed to determine a final position or displacement (see, e.g.,
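One way to process sampled acceleration data into a displacement, as described above, is to integrate twice (here with the trapezoidal rule); this sketch assumes gravity has already been removed and that the samples span only the interval between the bump and the terminating motion:

```python
def displacement_from_accel(accel, dt):
    """Estimate displacement by integrating acceleration twice.

    accel: list of (ax, ay) samples in m/s^2 with gravity already
    removed; dt: sampling interval in seconds. Assumes the device
    starts and ends at rest. Returns the (x, y) displacement in meters.
    """
    vx = vy = x = y = 0.0
    for i in range(1, len(accel)):
        # integrate acceleration -> velocity (trapezoidal rule)
        vx_new = vx + 0.5 * (accel[i - 1][0] + accel[i][0]) * dt
        vy_new = vy + 0.5 * (accel[i - 1][1] + accel[i][1]) * dt
        # integrate velocity -> position
        x += 0.5 * (vx + vx_new) * dt
        y += 0.5 * (vy + vy_new) * dt
        vx, vy = vx_new, vy_new
    return x, y
```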
Rotation determinator 214 may be configured to determine a rotation of device 201 using motion data sensed by motion sensor 224. A rotation may be in 2D or 3D. A 2D rotation may involve one angle of rotation. A 3D rotation may involve two or more angles of rotation. Motion sensor 224 may capture motion data indicating a rotation. For example, a gyroscope may be used to calculate orientation and rotation. Accelerometers may also be used to determine a rotation. For example, multiple accelerometers may be placed on multiple locations of device 201, and used to determine a rotation. Based on the rotation, position manager 210 may determine a direction of a front side of device 201, which may be a direction in which sound, an audio effect, or other content may be directed (see, e.g.,
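As an illustrative sketch, a single angle of rotation about the vertical axis might be estimated by integrating gyroscope angular-rate samples; the units and sampling assumptions are illustrative:

```python
def rotation_from_gyro(gyro_z, dt):
    """Estimate rotation about the vertical axis from gyroscope data.

    gyro_z: list of angular rates about the z axis in degrees/second;
    dt: sampling interval in seconds. Returns the net angle in degrees.
    """
    angle = 0.0
    for i in range(1, len(gyro_z)):
        angle += 0.5 * (gyro_z[i - 1] + gyro_z[i]) * dt  # trapezoidal rule
    return angle
```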
Position determinator 215 may be configured to determine a position of device 201 relative to device 202 based on a displacement of device 201 and a displacement of device 202. Position determinator 215 may also determine a position of device 201 relative to device 202 based on a rotation of device 201 and a rotation of device 202. In some examples, data representing a displacement of device 201 may be determined locally by displacement determinator 213. In some examples, data representing displacement of device 201 may be determined locally at device 201, and transmitted to a position determinator implemented at a server. In some examples, motion data from motion sensor 224 may be transmitted to a server, where a displacement determinator and a position determinator are implemented. In some examples, data representing a displacement of device 202 may be transmitted to position manager 210. The transmission of data may be performed using wired or wireless communications (using, e.g., communications facility 217), acoustic signals (using, e.g., speaker 222 and microphone 223), and the like. Other configurations and networks may also be used. Once displacements of devices 201-202 are received, a position may be determined. A position of device 201 may include information associated with a distance, direction, orientation, or other parameter. For example, the position of device 201 relative to device 202 may be determined using trigonometry. For example, if device 201 is displaced by a=10 cm at 160 degrees from a reference point, and device 202 is displaced by b=5 cm at 20 degrees from the reference point, then the distance between devices 201-202, c, may be determined using the law of cosines, c^2 = a^2 + b^2 − 2ab cos(160° − 20°), which yields c ≈ 14.2 cm. The direction of device 201 relative to device 202 may also be determined. As another example, device 201 may be rotated by 10 degrees, and device 202 may be rotated by −10 degrees, then the orientation of device 201 relative to device 202 may be 20 degrees. Still, other methods for determining a position may be used. For example, a GPS receiver may be used to determine the longitude and latitude coordinates of devices 201-202.
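The law-of-cosines computation in the example above might be sketched as follows (the function name is illustrative):

```python
import math

def distance_between(a, theta_a, b, theta_b):
    """Distance between two devices displaced from a shared reference
    point, via the law of cosines: c^2 = a^2 + b^2 - 2ab*cos(gamma).

    a, b: displacement magnitudes; theta_a, theta_b: displacement
    directions in degrees from the same reference.
    """
    gamma = math.radians(theta_a - theta_b)  # angle between displacements
    return math.sqrt(a * a + b * b - 2 * a * b * math.cos(gamma))

# The example from the text: a = 10 cm at 160 degrees, b = 5 cm at 20 degrees
print(round(distance_between(10, 160, 5, 20), 1))  # ~14.2 cm
```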
Audio signal determinator 216 may be configured to modify, adjust, generate, or otherwise determine an audio signal to be presented at device 201 and/or device 202 as a function of the position of device 201. For example, an audio effect may be presented by devices 201-202. Audio signal determinator 216 may determine the audio signals or audio channels to be presented at devices 201-202 such that the audio effect is presented. More than one loudspeaker may be coupled to each of devices 201-202. Audio signal determinator 216 may determine the audio signal to be presented at a plurality of loudspeakers. For example, surround sound audio content may have a left channel and a right channel. Audio signal determinator 216 may determine whether device 201 should present the left channel or the right channel based on whether it is to the right or to the left of device 202. As another example, audio signal determinator 216 may modify or adjust the audio signals to be presented based on the positions of devices 201-202 using 2D or 3D audio techniques or algorithms.
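A minimal sketch of such a left/right channel assignment follows, assuming device positions are known in a shared frame with +x to the listener's right; the mapping and names are illustrative:

```python
def assign_channels(pos_201, pos_202):
    """Map stereo channels to two devices from their x coordinates:
    the device further to the left presents the left channel.

    pos_201, pos_202: (x, y) positions in a shared frame (listener
    facing +y, +x to the listener's right). Illustrative only.
    """
    if pos_201[0] <= pos_202[0]:
        return {"device_201": "left", "device_202": "right"}
    return {"device_201": "right", "device_202": "left"}
```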
As another example, audio signal determinator 216 may terminate a transmission of data between devices 201-202 based on the position of device 201 relative to device 202. For example, a distance between device 201 and device 202 may exceed a threshold, and position manager 210 may determine that audio signals should be provided on device 201 but not on device 202. Audio signal determinator 216 may transmit a control signal to stop the presentation of audio at device 202, may stop transmission of data representing an audio signal to device 202, and the like. In one example, based on a position of device 201 relative to device 202, audio signal determinator 216 may determine that device 201 should present a left audio channel and device 202 should present a right audio channel, which may together provide an audio effect. Each device 201-202 may have a left speaker and a right speaker. Device 201 may present the left audio channel at both its left and right speakers, while device 202 may present the right audio channel at both its left and right speakers. In one example, device 201 may then be moved further away from device 202, and the distance between devices 201-202 may exceed a threshold. In another example, device 202 may be rotated, and the orientation angle between devices 201-202 may exceed a threshold. Audio signal determinator 216 may determine that device 202 should stop presenting the right audio channel, and may determine that the left speaker of device 201 should present the left audio channel and the right speaker of device 201 should present the right audio channel. Still other operations may be performed by audio signal determinator 216 and position manager 210 as a function of the position of device 201.
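The distance-threshold behavior described above might be sketched as follows, where the 5.0 m threshold and the routing structure are illustrative assumptions:

```python
def route_audio(distance, threshold=5.0):
    """Choose a speaker routing based on inter-device distance.

    Below the threshold, the devices split the stereo pair (each
    presents one channel on both of its speakers); above it, device
    202 is silenced and device 201 presents both channels itself.
    Each tuple is (left-speaker channel, right-speaker channel);
    the 5.0 m threshold is illustrative.
    """
    if distance <= threshold:
        return {"device_201": ("left", "left"), "device_202": ("right", "right")}
    return {"device_201": ("left", "right"), "device_202": None}
```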
User interface 221 may be configured to exchange data between device 201 and a user. User interface 221 may include one or more input-and-output devices, such as a keyboard, mouse, audio input (e.g., speech-to-text device), display (e.g., LED, LCD, or other), monitor, cursor, touch-sensitive display or screen, and the like. User interface 221 may be used to enter a user command to initiate a process of determining a position of device 201. User interface 221 may be used to create or modify a bump template, a terminating motion template, or other templates to be used to determine command signals or gestures. User interface 221 may be used to create or modify operations that may be performed by audio signal determinator 216, or other operations performed by position manager 210. Still, user interface 221 may be used for other purposes.
Speaker 222 may include one or more transducers or loudspeakers. Speaker 222 may be configured to generate audio signals as directed by audio signal determinator 216. Speaker 222 may also generate acoustic signals to transmit data to another device. The acoustic signal may include a vibration, sound, ultrasound, infrasound, and the like. For example, Morse code may be used to encode data on acoustic signals for transmission. Other examples for encoding data on an acoustic signal are described in co-pending U.S. patent application Ser. No. 14/266,697, filed Apr. 30, 2014, entitled “Pairing Devices Using Acoustic Signals.”
Microphone 223 may include one or more transducers or microphones. Microphone 223 may be used to receive voice commands from a user. Microphone 223 may be used to determine an ambient audio signal. Microphone 223 may be used to receive acoustic signals encoded with data, as described above.
Sensor 225 may include one or more sensors of a variety of types, such as a location sensor (e.g., a GPS receiver or other location sensor), a thermometer, an altimeter, a light sensor, a proximity sensor (e.g., a sensor that may detect a strength of a data signal), an ultrasonic sensor, and the like. Sensor 225 may be used in lieu of or in conjunction with motion sensor 224 to gather data, which may be used to determine a displacement or rotation of device 201, to determine a bump, terminating motion, or other gesture or command associated with device 201, and the like. Sensor 225 may be used to determine a distance between device 201 and a user, or a distance between device 201 and a wearable device of a user, which may be used to determine an operation or functionality of device 201 (see, e.g.,
As described above, device A 301b may be moved in the −x direction (e.g., relative to device 302). In one example, device A 301b may first experience an acceleration to reach a certain speed in the −x direction. Once that speed is reached, acceleration may become zero. Device A 301b may have no change in vertical height during this time. As shown, for example, the beginning of the portion of x-axis motion data 332 may indicate a negative acceleration, corresponding to device A 301b accelerating in the −x direction. Portion of x-axis motion data 332 may become zero as device A 301b attains a constant speed in the −x direction. Portion of z-axis motion data 342 may be constant, which may indicate that device A 301b is not being moved in the z direction. Motion data 342 may reflect a gravitational force experienced by device A 301b. Still, device A 301b may be moved along different paths, including in the y direction, which may result in different portions of motion data in the different axes. In some examples, device A 301b may be rotated, which may also result in different motion data from different motion sensors in different axes. Portions of motion data 332 and 342 (and in some examples other portions of motion data) may be used to determine a displacement, rotation, or other parameter associated with a position of device A 301b, as is described below with respect to
In some examples, device A 301c may be placed on surface 352, which may indicate an end point for the movement of device A 301. As shown, for example, as the movement of device A 301c is slowed down in the −x direction, portion of x-axis motion data 333 may indicate a gradual increase. As device A 301c comes to a complete stop on surface 352, portion of motion data 333 may decrease to zero. For example, as device A 301c is moved downward to be placed on surface 352, portion of z-axis motion data 343 may indicate an increase. As device A 301c hits surface 352, portion of motion data 343 may indicate a decrease. As device A 301c comes to a rest, portion of motion data 343 may reflect only the gravitational force experienced by device A 301c. Portions of motion data 333 and 343 may be compared with one or more templates or conditions associated with a terminating motion. As described above, other data types may also be included in the templates or conditions. For example, an ultrasonic signal may be used to detect the proximity of surface 352 to device A 301c. Once a terminating motion is found, a position manager may determine a displacement and/or rotation of device A 301 using the preceding portion of motion data. The portion of motion data (e.g., portions 332 and 342) between a portion of motion data that matches a bump template (e.g., portions 331 and 341) and another portion of motion data that matches a terminating motion template (e.g., portions 333 and 343) may be used to determine a displacement, rotation, or other parameter associated with a position of device A 301. As described above, other gestures or command signals may be used to indicate an end point of the movement of device A 301.
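For example, a terminating-motion condition of the kind described above (the device settling so that the z axis reflects only gravity) might be sketched as follows; the window, tolerance, and gravity constant are illustrative:

```python
def detect_put_down(accel_z, window=5, tol=0.2, g=9.8):
    """Detect a terminating motion: the z-axis reading settles back to
    gravity alone for a full window of samples (the device is at rest).

    accel_z: z-axis accelerometer readings in m/s^2; window, tol, and
    g are illustrative parameters. Returns the index where rest begins,
    or None.
    """
    for i in range(len(accel_z) - window + 1):
        if all(abs(a - g) < tol for a in accel_z[i:i + window]):
            return i
    return None
```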
In some examples, a pairing of device A 301 with device B 302 may occur in parallel with a determining of a position of device A 301. For example, an initiating motion (e.g., a bump) may initiate or prompt a process of pairing 361. While the initiating motion may indicate a beginning of the portion of motion data to be used for determining a displacement, the initiating motion may also begin a process of pairing 361. The process of pairing 361 may include device A 301 identifying device B 302 (e.g., using an address, name, other identifier, etc.), device A 301 performing handshake procedures with device B 302 (e.g., exchanging keys, nonces, random numbers, etc.), device A 301 generating a shared key or link key that may be used to authenticate a pairing or trusted relationship with device B 302, encrypting data that is exchanged between device A 301 and device B 302, and the like. Pairing of devices is also described in co-pending U.S. patent application Ser. No. 14/266,697, filed Apr. 30, 2014, entitled “Pairing Devices Using Acoustic Signals,” which is incorporated by reference herein in its entirety for all purposes. After being paired, device A 301 and device B 302 may remain paired, as indicated by 362. While device A 301 and device B 302 remain paired, device A 301 and device B 302 may exchange data (e.g., data representing an audio signal or audio channel, data representing user settings, etc.), control signals, and the like. At a later point in time, device A 301 and device B 302 may become disconnected, as indicated by 363. The disconnection may be triggered when device A 301 is moved away from device B 302, and the distance of device A 301 from device B 302 exceeds a threshold. The disconnection may also be triggered by another command signal or prompt (e.g., a button press, data signals becoming out of range, etc.).
As shown,
After the bump, device C may be moved from its initial position 503a to its end position 503b. As described above, a position of device C 503 relative to device A 501 may be determined using motion data and/or other data. A first portion of motion data may match a bump template, and a third portion of motion data may match a terminating motion template. A second portion of motion data, between the first portion and the third portion, may be used to determine a displacement of device C from position 503a to position 503b, including magnitude d and an angle of direction (not shown). The second portion of motion data may also be used to determine a rotation angle R. Based on the displacement and rotation of device A from position 501a to position 501b, and the displacement and rotation of device C from position 503a to position 503b, the relative positions of device A 501 and device C 503 may be determined. Finally, using the relative positions of device A 501 and device B 502, the position of device C 503b relative to device B 502 may be determined. Still, other processes or methods may be used for positioning more than two devices using motion data.
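Chaining relative positions in this way might be sketched as follows, assuming the displacement vectors have already been expressed in a common reference frame (e.g., after accounting for rotations):

```python
def compose(vec_a_from_b, vec_c_from_a):
    """Chain two relative positions expressed in a shared frame:
    the vector from device B to device C is the vector from B to A
    plus the vector from A to C.
    """
    return (vec_a_from_b[0] + vec_c_from_a[0],
            vec_a_from_b[1] + vec_c_from_a[1])

# A is 1 m to the right of B; C is 2 m behind A -> C relative to B:
print(compose((1.0, 0.0), (0.0, -2.0)))  # (1.0, -2.0)
```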
In some examples, a distance d2 between device A 601 and wearable device 632 may be determined. In some examples, distance d2 may be determined using a proximity sensor coupled to device A 601. Distance d2 may be determined based on a received signal strength indicator (RSSI), or a signal strength of a radio signal transmitted from wearable device 632. In some examples, distance d2 may be determined using an ultrasonic sensor coupled to device A 601. Still, other methods of determining distance d2 may be used. A position manager may determine that distance d1 has exceeded a first threshold, and that device A 601 and device B 602 should be disconnected. Position manager may determine that presentation of an audio signal should be terminated at one of device A 601 and device B 602. Position manager may further determine that distance d2 is less than a second threshold. In some examples, position manager may determine that distance d2 is less than the distance between device B 602 and wearable device 632. Based on distance d2, position manager may terminate presentation of an audio signal at device B 602, and may continue presentation of an audio signal at device A 601. In some examples, the audio signal presented at device A 601 may be changed or modified after termination of the presentation of an audio signal at device B 602. For example, before the disconnection, device A 601 may present a first audio channel configured to produce an audio effect together with a second audio channel being presented at device B 602. After the disconnection, device A 601 may present a different audio signal, which may be configured to provide the audio effect, or a different audio experience, without the interaction with device B 602. For example, while device A 601 and device B 602 may each present a left channel and a right channel prior to movement, after device A 601 is moved above a threshold distance away from device B 602, device B 602 may stop presenting an audio signal, while device A 601 may present both the left channel and the right channel to present an audio effect.
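As an illustrative sketch, an RSSI reading of the kind described above might be converted to a rough distance estimate with a log-distance path-loss model; the calibration values below are environment-dependent assumptions:

```python
def distance_from_rssi(rssi, rssi_at_1m=-45.0, path_loss_exp=2.0):
    """Rough distance estimate from received signal strength using a
    log-distance path-loss model: rssi = rssi_at_1m - 10*n*log10(d).

    rssi: measured value in dBm; rssi_at_1m and path_loss_exp are
    environment-dependent calibration values (the defaults are
    illustrative). Returns the estimated distance in meters.
    """
    return 10 ** ((rssi_at_1m - rssi) / (10 * path_loss_exp))

print(round(distance_from_rssi(-65.0), 1))  # ~10.0 m with the defaults
```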
In some examples, a distance d3 between device A 601 and device C 603 may be determined. As described above, in some examples, the relative positions of device A 601, device B 602, and device C 603 may be determined by a position manager (using the processes and techniques described above). As user 631 moves device A 601 away from device B 602, the user may move device A 601 closer to device C 603. Distance d3 may be determined based on motion data received from a motion sensor coupled to device A 601. If distance d3 falls below a threshold, a position manager may connect device A 601 and device C 603, or initiate an interaction between device A 601 and device C 603. In some cases, the position manager may cause presentation of audio signals at device A 601 and device C 603, which may together present an audio effect such as surround sound, 3D audio, or others. The audio signals may be selected or determined as a function of the position of device A 601 relative to device C 603. The position manager may cause transmission of other control or data signals between device A 601 and device C 603. Still, other interactions may be performed based on the movement of device A 601, device B 602, or device C 603, as detected by motion data and/or other types of data. The position manager may be implemented on device A 601, device B 602, device C 603, or a remote device or server. The position manager may also be distributed across device A 601, device B 602, device C 603, and/or a remote device or server.
According to some examples, computing platform 810 performs specific operations by processor 819 executing one or more sequences of one or more instructions stored in system memory 820, and computing platform 810 can be implemented in a client-server arrangement, peer-to-peer arrangement, or as any mobile computing device, including smart phones and the like. Such instructions or data may be read into system memory 820 from another computer readable medium, such as storage device 818. In some examples, hard-wired circuitry may be used in place of or in combination with software instructions for implementation. Instructions may be embedded in software or firmware. The term “computer readable medium” refers to any tangible medium that participates in providing instructions to processor 819 for execution. Such a medium may take many forms, including but not limited to, non-volatile media and volatile media. Non-volatile media includes, for example, optical or magnetic disks and the like. Volatile media includes dynamic memory, such as system memory 820.
Common forms of computer readable media include, for example, floppy disk, flexible disk, hard disk, magnetic tape, any other magnetic medium, CD-ROM, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, RAM, PROM, EPROM, FLASH-EPROM, any other memory chip or cartridge, or any other medium from which a computer can read. Instructions may further be transmitted or received using a transmission medium. The term “transmission medium” may include any tangible or intangible medium that is capable of storing, encoding or carrying instructions for execution by the machine, and includes digital or analog communications signals or other intangible medium to facilitate communication of such instructions. Transmission media include coaxial cables, copper wire, and fiber optics, including wires that comprise bus 801 for transmitting a computer data signal.
In some examples, execution of the sequences of instructions may be performed by computing platform 810. According to some examples, computing platform 810 can be coupled by communication link 823 (e.g., a wired network, such as LAN, PSTN, or any wireless network) to any other processor to perform the sequence of instructions in coordination with (or asynchronous to) one another. Computing platform 810 may transmit and receive messages, data, and instructions, including program code (e.g., application code) through communication link 823 and communication interface 817. Received program code may be executed by processor 819 as it is received, and/or stored in memory 820 or other non-volatile storage for later execution.
In the example shown, system memory 820 can include various modules that include executable instructions to implement functionalities described herein. In the example shown, system memory 820 includes a motion and bump determination module 811, a pairing module 812, a displacement determination module 813, a rotation determination module 814, a position determination module 815, and an audio signal determination module 816.
Although the foregoing examples have been described in some detail for purposes of clarity of understanding, the above-described inventive techniques are not limited to the details provided. There are many alternative ways of implementing the above-described inventive techniques. The disclosed examples are illustrative and not restrictive.
This application is related to co-pending U.S. patent application Ser. No. 14/266,697, filed Apr. 30, 2014, entitled “Pairing Devices Using Acoustic Signals,” which is incorporated by reference herein in its entirety for all purposes.