Pairing a computing device to a user

Information

  • Patent Grant
  • Patent Number
    9,342,139
  • Date Filed
    Monday, December 19, 2011
  • Date Issued
    Tuesday, May 17, 2016
Abstract
A method for automatically pairing an input device to a user is provided herein. According to one embodiment, the method includes receiving an input from an unpaired input device within an observed scene, and calculating a position of the unpaired input device upon receiving the input. The method further includes detecting one or more users within the observed scene via a capture device, creating a candidate list of the one or more detected users determined to be within a vicinity of the unpaired input device, and assigning one detected user on the candidate list to the unpaired input device to initiate pairing.
Description
BACKGROUND

Computing devices are becoming increasingly more interactive. Further, various different computing devices may be communicatively linked so as to provide an enhanced computing experience for each user. For example, a user may provide input via a wireless device to control aspects of a gaming computing device. Such a wireless device may be utilized by various different users, and such users may desire to identify themselves upon taking control of the wireless device. One prior approach prompts a user to enter a user name and/or passphrase to gain access to the gaming computing device. However, such an approach interrupts gameplay, which detracts from the gaming experience and causes user frustration.


SUMMARY

A method for automatically pairing an input device to a user is provided herein. According to one embodiment, the method includes receiving an input from an unpaired input device within an observed scene, and calculating a position of the unpaired input device upon receiving the input. The method further includes detecting one or more users within the observed scene via a capture device, creating a candidate list of the one or more detected users determined to be within a vicinity of the unpaired input device, and assigning one detected user on the candidate list to the unpaired input device to initiate pairing.


This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 schematically shows an example computing environment including an example mobile device according to an embodiment of the present disclosure.



FIG. 2 is a flowchart that illustrates an example method for pairing the mobile device of FIG. 1 to a user according to an embodiment of the present disclosure.



FIG. 3 is a flowchart that illustrates an example method for terminating the pairing of the mobile device of FIG. 1 to the user according to an embodiment of the present disclosure.



FIG. 4 is a flowchart that illustrates an example method for transferring the pairing of the mobile device of FIG. 1 to another user according to an embodiment of the present disclosure.



FIG. 5 is a schematic view of a computing system that may be used as the gaming system of FIG. 1.





DETAILED DESCRIPTION

Aspects of this disclosure will now be described by example and with reference to the illustrated embodiments listed above.



FIG. 1 shows an example 3D interaction space 100 in which user 10 is located. FIG. 1 also shows gaming system 12, which may enable user 10 to interact with a video game. Gaming system 12 may be used to play a variety of different games, play one or more different media types, and/or control or manipulate non-game applications and/or operating systems. Gaming system 12 may include gaming console 14 and display device 16, which may be used to present game visuals to game players. Gaming system 12 is a type of computing device, the details of which will be discussed with respect to FIG. 5.


Turning back to FIG. 1, 3D interaction space 100 may also include a capture device 18 such as a camera, which may be coupled to gaming system 12. Capture device 18, for example, may be a depth camera used to observe 3D interaction space 100 by capturing images. Thus, 3D interaction space 100 may also be referred to herein as an observed scene. As such, capture device 18 may be used to capture images of user 10 which may be used to identify user 10. For example, user 10 may be identified by a physical attribute unique to user 10 by employing three-dimensional imaging, facial recognition, biometrics, or another user identifying technology. Further, in some embodiments, a presence of user 10 may be detected without determining an identity of the user.


As shown, user 10 is holding an input device 20 with right hand 22. User 10 may interact with input device 20 to control aspects of gaming system 12. In the depicted example, input device 20 is a magic wand which may be used to cast spells in a wizard game, for example. Because a position of input device 20 is within a vicinity of right hand 22, input device 20 may be automatically paired to user 10. For example, capture device 18 and/or another sensor of computing system 12 may determine a position of input device 20 and a position of user 10. Further, computing system 12 may determine that user 10 is a nearest user to input device 20. As described in more detail below, the nearest user to an input device may be automatically paired to the input device.



FIG. 1 further shows a second user 24 and a third user 26. The second and third users may also be present within 3D interaction space 100; thus, the second and third users may also be observed by capture device 18. As shown, the second and third users are farther from input device 20 than user 10. Therefore, the second user and the third user may be at a position that is not within the vicinity of input device 20. As such, neither user 24 nor user 26 may be paired to input device 20.



FIG. 1 further shows various unpaired computing devices, such as controller 28 and mobile computing device 30. Such devices may be unpaired because the devices are idle. In other words, neither user 10, user 24, nor user 26 is interacting with controller 28 or mobile computing device 30; therefore, computing system 12 is not receiving input from either device. As such, controller 28 and mobile device 30 may be in an unpaired state. However, such devices may be automatically paired to a user when computing system 12 receives input from an unpaired device. For example, user 24 may pick up controller 28 and press a button to block the cast spell, and computing system 12 may automatically pair controller 28 to user 24 upon receiving the input of the pressed button.


It will be appreciated that FIG. 1 is provided by way of example, and thus, is not meant to be limiting. Therefore, it is to be understood that a wizard game is provided to illustrate a general concept, and various other games and non-game applications are possible without departing from the scope of this disclosure.


Further, it will be appreciated that a user may interact with the input device in any suitable way. Therefore, it is to be understood that user 10 interacting with the input device using right hand 22 is provided by way of example, and thus, is not meant to be limiting. For example, user 10 may interact with an input device using a left hand. As another example, the input device may be attached to a body portion of user 10, such as a leg, a knee, a shoulder, an elbow, or another body portion. In this way, the user may interact with an input device attached to the user's body, and a position of such an input device may be determined and associated with a particular body portion to which the input device is attached. For example, the computing system may determine that the attached input device and the associated body portion are co-localized, and thus may be moving similarly. As such, the computing system may determine that the user is interacting with the attached input device. Continuing with the above example, user 10 may attach a shield input device to a forearm and such an attached input device may be used to block a cast spell, for example.



FIG. 2 is a flowchart that illustrates an example method 200 for automatically pairing an input device to a user. Method 200 may be executed by a main computing device (e.g., gaming system 12) communicatively coupled to the input device (e.g., input device 20), for example.


At 202, method 200 includes receiving an input from an unpaired input device within an observed scene. For example, the received input may include a button press, a joystick movement, a D-pad movement, a touch event, and/or a detected motion of the unpaired input device. Further, the unpaired input device may include a motion sensor and/or a tag that may indicate the detected motion of the device. For example, the motion sensor may provide the detected motion input to the main computing device. As another example, a capture device of the main computing device may detect the motion of the tag associated with the unpaired input device. It will be appreciated that the unpaired input device emits a wireless signal that is received by the main computing device, and which contains an input device identifier by which the computing device may identify the input device. The input device 20 itself may be a game controller, mobile telephone, tablet computer, or other device capable of communicating with the computing device by a wireless signal. The wireless signal may be in the form of infrared, BLUETOOTH, WiFi, or other suitable wireless signal. Further, the input device identifier may be a GUID, MAC address, or other identifier that is used to distinguish the input device from other input devices in the scene.
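The receiving step can be sketched in code. The patent does not prescribe any data structures; the event class, the enum of input kinds, and the `paired` flag below are illustrative assumptions, not part of the disclosure:

```python
from dataclasses import dataclass
from enum import Enum, auto

class InputKind(Enum):
    """Input types named at 202 of method 200."""
    BUTTON_PRESS = auto()
    JOYSTICK_MOVE = auto()
    DPAD_MOVE = auto()
    TOUCH = auto()
    MOTION = auto()

@dataclass(frozen=True)
class DeviceInput:
    """An input event received over a wireless link from an input device."""
    device_id: str        # e.g., a GUID or MAC address distinguishing the device
    kind: InputKind
    paired: bool = False  # whether the sending device is currently paired

def is_pairing_trigger(event: DeviceInput) -> bool:
    """Any input arriving from an unpaired device triggers the pairing flow."""
    return not event.paired
```

Under this sketch, a button press from an unpaired wand would be represented as `DeviceInput(device_id="AA:BB:CC:DD:EE:FF", kind=InputKind.BUTTON_PRESS)` and would start method 200.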


At 204, method 200 includes calculating a position of the unpaired input device upon receiving the input. For example, the position may include a distance and an angle of the unpaired input device from the main computing device. In some embodiments, calculating the position may include a beamforming technique to determine the distance and the angle of the unpaired input device with respect to the main computing device. However, it will be appreciated that other techniques are possible without departing from the scope of this disclosure. For example, a two-dimensional positioning technique may be used. As another example, a three-dimensional positioning technique may be used. Further, in some embodiments the unpaired input device may include a tag and the position of the unpaired input device may be determined by locating a position of the tag. For example, the tag may include visible identifying information which may be detected by the main computing device. Thus, it will be appreciated that the tag may be an augmented reality marker that contains an optical code that is captured in an image taken by a camera, such as a visible light camera, which is associated with the main computing device, and the position of the marker in the scene can be determined by the main computing device.
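The distance-and-angle output of a beamforming estimate can be converted to a scene position along the following lines. This is a hedged sketch: the coordinate convention (x right, y up, z forward from the capture device) and the function name are assumptions not stated in the patent:

```python
import math

def position_from_range_bearing(distance_m: float, azimuth_deg: float,
                                elevation_deg: float = 0.0) -> tuple:
    """Convert a range/bearing estimate (e.g., from beamforming) into
    Cartesian coordinates in the main computing device's frame."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = distance_m * math.cos(el) * math.sin(az)  # lateral offset
    y = distance_m * math.sin(el)                 # height offset
    z = distance_m * math.cos(el) * math.cos(az)  # depth from device
    return (x, y, z)
```

Leaving `elevation_deg` at zero gives the two-dimensional variant mentioned above; supplying it gives the three-dimensional one.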


At 206, method 200 includes detecting one or more users within the observed scene via a capture device of the main computing device, wherein the observed scene includes the unpaired input device. For example, capture device 18 may detect one or more users within the observed scene.


At 208, method 200 includes creating a candidate list of the one or more detected users determined to be within a vicinity of the unpaired input device. For example, the candidate list may include user 10, user 24, and user 26 of FIG. 1. Further, the candidate list may include a distance between each user and the unpaired input device for which the input was received. Further, the candidate list may include a right hand distance, a left hand distance, and a geometric center distance of each user as measured with respect to the position of the unpaired input device. Further, such a candidate list may be sorted by proximity to the unpaired input device with respect to each distance.


It will be appreciated that the candidate list may be created in any suitable way. For example, the candidate list may be sorted by distance in ascending or descending order. As another example, one or more detected users may be excluded from the candidate list if such users exceed a threshold distance from the unpaired input device. In this way, those users which are too far away from the unpaired input device to be a conceivable candidate for pairing may be automatically removed from the candidate list to reduce processing time, for example.
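The candidate-list step at 208, with the threshold exclusion just described, might be sketched as follows. The per-user layout (right hand, left hand, geometric center positions) follows the description above, but the data representation itself is an illustrative assumption:

```python
def build_candidate_list(users: dict, device_pos: tuple,
                         max_distance: float) -> list:
    """users maps a user id to a dict of joint name -> (x, y, z) position,
    e.g., {'right_hand': ..., 'left_hand': ..., 'center': ...}.
    Returns (user_id, distances) pairs sorted nearest-first; users whose
    every measured distance exceeds max_distance are excluded."""
    def dist(a, b):
        return sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5

    candidates = []
    for user_id, joints in users.items():
        distances = {name: dist(pos, device_pos) for name, pos in joints.items()}
        # Exclude users too far away to be conceivable pairing candidates.
        if min(distances.values()) <= max_distance:
            candidates.append((user_id, distances))
    # Sort by each user's closest measure to the unpaired input device.
    candidates.sort(key=lambda entry: min(entry[1].values()))
    return candidates
```

Sorting ascending here is one of the "any suitable way" options; descending order or a per-measure sort would serve equally under the disclosure.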


At 210, method 200 includes assigning one detected user on the candidate list to the unpaired input device to initiate pairing, the one user being a nearest user to the unpaired input device. Further, once assigned, the unpaired input device may be considered a paired input device. In some embodiments, the nearest user is a user on the candidate list with a strongest correlation to the unpaired input device. For example, the right hand distance, the left hand distance, and/or the geometric center distance of the nearest user may be the smallest distance from the input device included on the candidate list. In this way, the nearest user may be automatically paired to the unpaired input device. Further, assigning the nearest user to the unpaired input device may include automatically pairing a profile associated with the nearest user to the input device.
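Given a nearest-first candidate list, the assignment step at 210 reduces to taking the head of the list and recording the association. A minimal registry sketch, with hypothetical class and method names:

```python
class PairingRegistry:
    """Tracks which user (and thus which profile) each device is paired to."""

    def __init__(self):
        self._pairs = {}  # device_id -> user_id

    def pair(self, device_id: str, candidates: list):
        """Assign the nearest candidate (first entry of a proximity-sorted
        list of (user_id, distances) pairs). Returns the paired user id,
        or None if no candidate was within the vicinity."""
        if not candidates:
            return None
        user_id, _ = candidates[0]
        self._pairs[device_id] = user_id
        return user_id

    def is_paired(self, device_id: str) -> bool:
        return device_id in self._pairs

    def unpair(self, device_id: str) -> None:
        self._pairs.pop(device_id, None)
```

Because the mapping is keyed by device, one user may appear as the value for two devices at once, matching the two-device scenario discussed below.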


At 212, method 200 includes tracking the nearest user and the paired input device. For example, tracking the nearest user and the paired input device may include correlating a motion of the right hand, the left hand, and/or the geometric center of the nearest user to the motion of the paired input device. For example, a motion of the user's right hand may provide input through interaction with an input device. In the example provided above, user 10 waves magic wand 20 to provide input to gaming system 12. The concurrent motion of the right hand of user 10 and magic wand 20 may be tracked using capture device 18, for example. Since the motion of the right hand and the motion of the magic wand occur concurrently, the gaming system may determine that user 10 remains paired with magic wand 20.
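One plausible way to correlate the hand track and the device track at 212 is a correlation of per-frame speeds: a value near 1.0 suggests the hand and device are moving together. This is an illustrative metric of my choosing, not a computation specified by the patent:

```python
def motion_correlation(hand_track: list, device_track: list) -> float:
    """Pearson-style correlation of per-frame displacement magnitudes for
    a tracked joint and a paired device. Tracks are lists of (x, y, z)
    positions sampled at the same frame times."""
    def speeds(track):
        # Magnitude of displacement between consecutive frames.
        return [sum((b - a) ** 2 for a, b in zip(p, q)) ** 0.5
                for p, q in zip(track, track[1:])]

    hs, ds = speeds(hand_track), speeds(device_track)
    n = min(len(hs), len(ds))
    hs, ds = hs[:n], ds[:n]
    mh, md = sum(hs) / n, sum(ds) / n
    cov = sum((h - mh) * (d - md) for h, d in zip(hs, ds))
    sh = sum((h - mh) ** 2 for h in hs) ** 0.5
    sd = sum((d - md) ** 2 for d in ds) ** 0.5
    if sh == 0 or sd == 0:
        return 0.0  # a motionless track carries no correlation evidence
    return cov / (sh * sd)
```

A wand waved by the right hand yields near-identical speed profiles for the two tracks, so the correlation stays high and the pairing is retained.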


It will be appreciated that method 200 is provided by way of example, and thus, is not meant to be limiting. Therefore, it is to be understood that method 200 may include additional and/or alternative steps than those illustrated in FIG. 2. For example, in some embodiments, a nearest user may interact with two different devices to provide input to a computing device via the two different devices. In such a scenario, both devices may be paired to the nearest user concurrently.


Further, it is to be understood that method 200 may be performed in any suitable order. Further still, it is to be understood that one or more steps may be omitted from method 200 without departing from the scope of this disclosure.


For example, method 200 may further include identifying each detected user within the observed scene by identifying a physical attribute of each detected user and matching the physical attribute of each detected user to a profile associated with each detected user. As non-limiting examples, the physical attribute may be detected using three-dimensional imaging, facial recognition, biometric based detection and/or other user identifying technologies. In this way, the detected user may be identified and paired to the input device such that the profile of the detected user is associated with the paired input device.



FIG. 3 is a flowchart that illustrates an example method 300 for terminating the pairing of an input device to a user. At 302, method 300 includes pairing an input device to a nearest user. For example, method 200 may be used to pair the input device to the nearest user.


At 304, method 300 includes determining if input from the input device is received by a computing device. If the answer to 304 is YES, method 300 proceeds to 306. If the answer to 304 is NO, method 300 proceeds to 308.


At 306, method 300 includes continuing to pair the input device to the nearest user. For example, if the nearest user is interacting with the input device, then the nearest user may continue to be paired to the input device.


At 308, method 300 includes determining if input from the input device is received by the computing device within a threshold period of time. For example, the input device may be temporarily idle. As such, the nearest user may not be actively interacting with the input device at a particular moment in time. For example, the nearest user may interact with the input device to cast a spell, and the input device may be temporarily idle immediately after casting the spell. For example, the nearest user may be waiting for another game player to cast an opposing spell. It will be appreciated that the threshold period of time may be any suitable period of time. Further, the threshold period of time may vary for different settings, different games, and/or different non-game applications. As such, the threshold period of time may be customizable by a user and/or by a game/application developer. If the answer to 308 is NO, method 300 continues to 310. If the answer to 308 is YES, method 300 continues to 312.


At 310, method 300 includes terminating an association of the nearest user to the input device. For example, the input device may be idle because the nearest user set the input device down and/or decided to end a gaming session. Since the user does not interact with the input device within the threshold period of time, the input device may be unpaired from the nearest user, for example. As another example, the nearest user may hand off the input device to another user, and therefore the nearest user may be a former nearest user. Such an example of transferring a paired input device from one user to another user is described in further detail below with respect to FIG. 4.


At 312, method 300 includes comparing a position of the input device to a position of the nearest user. For example, the position of the input device may be compared to a position of a right hand of the nearest user.


At 314, method 300 includes determining if the position of the input device is within a vicinity of the position of the nearest user. For example, the vicinity may be a threshold distance from the input device. As one non-limiting example, the threshold distance may be a twelve inch radius around a perimeter of the paired input device. However, it will be appreciated that other threshold distances are possible without departing from the scope of this disclosure. Further, if the nearest user is within the threshold distance, then the user may be within the vicinity of the input device. If the answer to 314 is NO, method 300 continues to 310 and an association of the nearest user to the input device is terminated. If the answer to 314 is YES, method 300 continues to 316.


At 316, method 300 includes continuing to pair the input device to the nearest user. For example, the input device may be idle yet the user may still be holding the input device. Therefore, the nearest user may still be within the vicinity of the input device. Further, the nearest user may be providing periodic input via interaction with the input device; however, the periodic input may be provided within the threshold period of time. Thus, the nearest user may remain paired with the input device.
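The decision flow of FIG. 3 (steps 304 through 316) can be condensed into one hedged function. Treating "input just received" as zero idle time is a simplifying assumption, as are the parameter names; the twelve-inch vicinity mentioned above corresponds to roughly 0.3 m:

```python
def pairing_decision(now: float, last_input_time: float,
                     idle_threshold_s: float,
                     device_pos: tuple, user_pos: tuple,
                     vicinity_m: float) -> str:
    """One evaluation of method 300 for a currently paired device.
    Returns 'paired' to keep the association, 'unpaired' to terminate it."""
    idle = now - last_input_time
    if idle == 0:
        return "paired"            # 304 -> 306: input is being received
    if idle > idle_threshold_s:
        return "unpaired"          # 308 -> 310: no input within threshold
    # 312/314: device idle but within threshold; check user proximity.
    dist = sum((a - b) ** 2 for a, b in zip(device_pos, user_pos)) ** 0.5
    return "paired" if dist <= vicinity_m else "unpaired"
```

A user who sets the wand down and walks away fails the vicinity check and is unpaired even before the idle threshold elapses.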


It will be appreciated that method 300 is provided by way of example, and thus, is not meant to be limiting. Therefore, it is to be understood that method 300 may include additional and/or alternative steps than those illustrated in FIG. 3. Further, it is to be understood that method 300 may be performed in any suitable order. Further still, it is to be understood that one or more steps may be omitted from method 300 without departing from the scope of this disclosure.



FIG. 4 is a flowchart that illustrates an example method 400 for transferring the pairing of an input device to another user. At 402, method 400 includes tracking a first user and a paired input device, wherein the first user is a nearest user.


At 404, method 400 includes determining if a position of the first user is within a vicinity of the paired input device. For example, the vicinity may be a threshold distance from the paired input device, as described above. If the answer to 404 is YES, method 400 continues to 406. If the answer to 404 is NO, method 400 continues to 408.


At 406, method 400 includes continuing to track a position of the first user and the paired input device. As such, the first user may remain within the vicinity of the paired input device. Therefore, the first user may continue to interact with the input device to control aspects of a video game, for example.


At 408, method 400 includes determining if a position of a second user is within the vicinity of the paired input device. If the answer to 408 is NO, method 400 continues to 410. If the answer to 408 is YES, method 400 continues to 412.


At 410, method 400 includes terminating an association of the first user to the paired input device. For example, the first user may ‘hand off’ the paired input device to the second user. Therefore, in such a scenario the first user may be a former nearest user, and thus the first user may be unpaired from the input device.


At 412, method 400 includes transferring the paired input device to the second user. As such, an association between the second user and the paired input device may be formed. Therefore, in such a scenario, the second user may be a current nearest user to the input device. As described above, the second user may receive the input device from the first user as a hand off. Further, it will be appreciated that the second user and the paired input device may be tracked, as described above.
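One pass of the FIG. 4 flow (steps 404 through 412) might look like the following sketch; the function name and the string return convention are illustrative:

```python
def update_pairing(first_pos: tuple, second_pos: tuple,
                   device_pos: tuple, vicinity_m: float):
    """One evaluation of method 400 for a device paired to the first user.
    Returns 'first' to keep the pairing, 'second' to transfer it after a
    hand-off, or None if neither user is within the vicinity."""
    def near(pos):
        d = sum((a - b) ** 2 for a, b in zip(pos, device_pos)) ** 0.5
        return d <= vicinity_m

    if near(first_pos):
        return "first"   # 404 -> 406: continue tracking the first user
    if near(second_pos):
        return "second"  # 408 -> 412: transfer pairing to the second user
    return None          # 408 -> 410: terminate the association
```

Running this per frame lets the pairing follow the device through a hand-off without any manual re-login by the second user.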


It will be appreciated that method 400 is provided by way of example, and thus, is not meant to be limiting. Therefore, it is to be understood that method 400 may include additional and/or alternative steps than those illustrated in FIG. 4. Further, it is to be understood that method 400 may be performed in any suitable order. Further still, it is to be understood that one or more steps may be omitted from method 400 without departing from the scope of this disclosure.


As described above, by pairing an input device to a user based on a position of the input device and a position of the user, the user can be automatically paired to the input device. In this way, the user can avoid having to log into a user account or perform other manual inputs to pair a device. Further, by automatically pairing the device to the user in this way, the user may pick up a new device, switch devices with another user, use more than one device, etc., and the one or more devices may be paired to the user automatically in real-time. For example, using the above example, the user may interact with the wand input device to cast a spell, and then switch to the controller to throw a punch or another physical attack. By allowing the user to move among input devices that offer different gaming controls, while the pairing switches between the devices automatically, the gaming experience may be enhanced.


In some embodiments, the above described methods and processes may be tied to a computing system including one or more computers. In particular, the methods and processes described herein may be implemented as a computer application, computer service, computer API, computer library, and/or other computer program product.



FIG. 5 schematically shows a non-limiting computing system 70 that may perform one or more of the above described methods and processes. Computing system 70 is shown in simplified form. It is to be understood that virtually any computer architecture may be used without departing from the scope of this disclosure. In different embodiments, computing system 70 may take the form of a mainframe computer, server computer, desktop computer, laptop computer, tablet computer, home entertainment computer, network computing device, mobile computing device, mobile communication device, gaming device, etc.


Computing system 70 includes a processor 72 and a memory 74. Computing system 70 may optionally include a display subsystem 76, communication subsystem 78, sensor subsystem 80 and/or other components not shown in FIG. 5. Computing system 70 may also optionally include various user input devices such as keyboards, mice, game controllers, cameras, microphones, and/or touch screens, for example. FIG. 5 shows a game controller 28, magic wand 20, and a mobile computing device 30, which may provide input to computing system 70. As described above, such input devices may be automatically paired to a user depending on a proximity of the user to one or more of the input devices.


Processor 72 may include one or more physical devices configured to execute one or more instructions. For example, the processor may be configured to execute one or more instructions that are part of one or more applications, services, programs, routines, libraries, objects, components, data structures, or other logical constructs. Such instructions may be implemented to perform a task, implement a data type, transform the state of one or more devices, or otherwise arrive at a desired result.


The processor may include one or more processors that are configured to execute software instructions. Additionally or alternatively, the processor may include one or more hardware or firmware logic machines configured to execute hardware or firmware instructions. Individual processors may be single core or multicore, and the programs executed thereon may be configured for parallel or distributed processing. The processor may optionally include individual components that are distributed throughout two or more devices, which may be remotely located and/or configured for coordinated processing. One or more aspects of the processor may be virtualized and executed by remotely accessible networked computing devices configured in a cloud computing configuration.


Memory 74 may include one or more physical, non-transitory, devices configured to hold data and/or instructions executable by the processor to implement the herein described methods and processes. When such methods and processes are implemented, the state of memory 74 may be transformed (e.g., to hold different data).


Memory 74 may include removable media and/or built-in devices. Memory 74 may include optical memory devices (e.g., CD, DVD, HD-DVD, Blu-Ray Disc, etc.), semiconductor memory devices (e.g., RAM, EPROM, EEPROM, etc.) and/or magnetic memory devices (e.g., hard disk drive, floppy disk drive, tape drive, MRAM, etc.), among others. Memory 74 may include devices with one or more of the following characteristics: volatile, nonvolatile, dynamic, static, read/write, read-only, random access, sequential access, location addressable, file addressable, and content addressable. In some embodiments, processor 72 and memory 74 may be integrated into one or more common devices, such as an application specific integrated circuit or a system on a chip.



FIG. 5 also shows an aspect of the memory in the form of removable computer-readable storage media 82, which may be used to store and/or transfer data and/or instructions executable to implement the herein described methods and processes. Removable computer-readable storage media 82 may take the form of CDs, DVDs, HD-DVDs, Blu-Ray Discs, EEPROMs, and/or floppy disks, among others.


It is to be appreciated that memory 74 includes one or more physical, non-transitory devices. In contrast, in some embodiments aspects of the instructions described herein may be propagated in a transitory fashion by a pure signal (e.g., an electromagnetic signal, an optical signal, etc.) that is not held by a physical device for at least a finite duration. Furthermore, data and/or other forms of information pertaining to the present disclosure may be propagated by a pure signal.


The terms “module,” “program,” and “engine” may be used to describe an aspect of computing system 70 that is implemented to perform one or more particular functions. In some cases, such a module, program, or engine may be instantiated via processor 72 executing instructions held by memory 74. It is to be understood that different modules, programs, and/or engines may be instantiated from the same application, service, code block, object, library, routine, API, function, etc. Likewise, the same module, program, and/or engine may be instantiated by different applications, services, code blocks, objects, routines, APIs, functions, etc. The terms “module,” “program,” and “engine” are meant to encompass individual or groups of executable files, data files, libraries, drivers, scripts, database records, etc.


It is to be appreciated that a “service”, as used herein, may be an application program executable across multiple user sessions and available to one or more system components, programs, and/or other services. In some implementations, a service may run on a server responsive to a request from a client.


When included, display subsystem 76 may be used to present a visual representation of data held by memory 74. As the herein described methods and processes change the data held by the memory, and thus transform the state of the memory, the state of display subsystem 76 may likewise be transformed to visually represent changes in the underlying data. Display subsystem 76 may include one or more display devices utilizing virtually any type of technology. Such display devices may be combined with processor 72 and/or memory 74 in a shared enclosure, or such display devices may be peripheral display devices.


When included, communication subsystem 78 may be configured to communicatively couple computing system 70 with one or more other computing devices. Communication subsystem 78 may include wired and/or wireless communication devices compatible with one or more different communication protocols. As nonlimiting examples, the communication subsystem may be configured for communication via a wireless telephone network, a wireless local area network, a wired local area network, a wireless wide area network, a wired wide area network, etc. In some embodiments, the communication subsystem may allow computing system 70 to send and/or receive messages to and/or from other devices via a network such as the Internet. Further, the communication subsystem may be configured for beamforming to locate a position of an input device within an observed scene, as described above.


Sensor subsystem 80 may include one or more sensors configured to sense one or more human subjects, as described above. For example, the sensor subsystem 80 may comprise one or more image sensors, motion sensors such as accelerometers, touch pads, touch screens, and/or any other suitable sensors. Therefore, sensor subsystem 80 may be configured to provide observation information to processor 72, for example. As described above, observation information such as image data, motion sensor data, and/or any other suitable sensor data may be used to perform such tasks as determining a position of each of a plurality of joints of one or more human subjects.


In some embodiments, sensor subsystem 80 may include a depth camera 84 (e.g., depth camera 18 of FIG. 1). Depth camera 84 may include left and right cameras of a stereoscopic vision system, for example. Time-resolved images from both cameras may be registered to each other and combined to yield depth-resolved video.
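Once the left and right images are registered, a pixel's depth follows from its disparity by the standard triangulation relation Z = f·B/d. A minimal sketch, with illustrative names and pixel/meter units assumed:

```python
def stereo_depth(disparity_px: float, focal_length_px: float,
                 baseline_m: float) -> float:
    """Depth of a matched pixel pair from a registered stereo rig.

    Z = f * B / d, where f is the focal length in pixels, B the
    camera baseline in meters, and d the disparity in pixels.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_length_px * baseline_m / disparity_px
```

Applying this per matched pixel over a video frame yields the depth-resolved video described above.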


In other embodiments, depth camera 84 may be a structured light depth camera configured to project a structured infrared illumination comprising numerous, discrete features (e.g., lines or dots). Depth camera 84 may be configured to image the structured illumination reflected from a scene onto which the structured illumination is projected. Based on the spacings between adjacent features in the various regions of the imaged scene, a depth image of the scene may be constructed.
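The structured-light geometry is the same triangulation idea: a projected dot shifts laterally with surface depth relative to a reference image captured at a known depth. A minimal sketch of that relation; the function name, sign convention (positive shift = nearer surface), and parameters are illustrative assumptions.

```python
def depth_from_dot_shift(shift_px: float, focal_length_px: float,
                         baseline_m: float, reference_depth_m: float) -> float:
    """Triangulate depth from the lateral shift of one projected dot
    relative to a reference pattern recorded at a known depth.

    With projector-camera baseline B and focal length f (pixels), the
    shift d of a dot satisfies d = f * B * (1/Z - 1/Z_ref), so
    1/Z = 1/Z_ref + d / (f * B).
    """
    inv_depth = 1.0 / reference_depth_m + shift_px / (focal_length_px * baseline_m)
    if inv_depth <= 0:
        raise ValueError("shift too large for this reference geometry")
    return 1.0 / inv_depth
```

Evaluating this for every identified feature, using the spacings between adjacent features to establish correspondence, produces the depth image described above.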


In other embodiments, depth camera 84 may be a time-of-flight camera configured to project a pulsed infrared illumination onto the scene. The depth camera may include two cameras configured to detect the pulsed illumination reflected from the scene. Both cameras may include an electronic shutter synchronized to the pulsed illumination, but the integration times for the cameras may differ, such that a pixel-resolved time-of-flight of the pulsed illumination, from the source to the scene and then to the cameras, is discernable from the relative amounts of light received in corresponding pixels of the two cameras.
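The two-shutter arrangement above can be idealized as a gated measurement: the returning pulse straddles two synchronized integration windows, and the split of energy between them encodes the round-trip delay. A minimal sketch under that idealization; the gating model and names are illustrative assumptions, not the patent's implementation.

```python
SPEED_OF_LIGHT = 2.998e8  # m/s

def gated_tof_depth(early_gate: float, late_gate: float,
                    pulse_width_s: float) -> float:
    """Per-pixel depth from a two-gate time-of-flight measurement.

    `early_gate` and `late_gate` are the light energies captured in two
    shutter windows synchronized to the emitted pulse. The fraction
    landing in the late gate grows linearly with the round-trip delay
    in this idealized model, so the ratio encodes distance.
    """
    total = early_gate + late_gate
    if total <= 0:
        raise ValueError("no returned light in either gate")
    delay_s = (late_gate / total) * pulse_width_s
    return SPEED_OF_LIGHT * delay_s / 2.0  # halve for the round trip
```

Because the ratio is formed per pixel from the two cameras' corresponding pixels, the result is the pixel-resolved time-of-flight described above.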


In some embodiments, sensor subsystem 80 may include a visible light camera 86. Virtually any type of digital camera technology may be used without departing from the scope of this disclosure. As a non-limiting example, visible light camera 86 may include a charge coupled device image sensor. Further, visible light camera 86 may be configured to detect an input device tag to locate a position of an input device within an observed scene, as described above.


It is to be understood that the configurations and/or approaches described herein are exemplary in nature, and that these specific embodiments or examples are not to be considered in a limiting sense, because numerous variations are possible. The specific routines or methods described herein may represent one or more of any number of processing strategies. As such, various acts illustrated may be performed in the sequence illustrated, in other sequences, in parallel, or in some cases omitted. Likewise, the order of the above-described processes may be changed.


The subject matter of the present disclosure includes all novel and nonobvious combinations and subcombinations of the various processes, systems and configurations, and other features, functions, acts, and/or properties disclosed herein, as well as any and all equivalents thereof.

Claims
  • 1. A method for automatically pairing an input device to a user, the method executed by a computing device communicatively coupled to the input device, the method comprising: receiving an input from an unpaired input device within an observed scene; calculating a position of the unpaired input device upon receiving the input, the position including a distance and an angle relative to the computing device; detecting one or more users within the observed scene via a capture device of the computing device, the observed scene including the unpaired input device; creating a candidate list of the one or more detected users determined to be within a vicinity of the unpaired input device, the candidate list including a distance between each user and the unpaired input device; assigning a first user from the candidate list to the unpaired input device to initiate pairing, the first user being a nearest user to the unpaired input device, and once assigned, the unpaired input device being a first paired input device; switching the first paired input device from the first user to a second user when the second user is closer to the first paired input device than the first user; switching a second paired input device of the second user from the second user to the first user when the first user is closer to the second paired input device than the second user; and concurrently pairing both the first paired input device and the second paired input device to the first user when the first user is closer than the second user to both the first paired input device and the second paired input device.
  • 2. The method of claim 1, further comprising identifying each detected user within the observed scene by identifying a physical attribute of each detected user and matching the physical attribute of each detected user to a profile associated with each detected user.
  • 3. The method of claim 2, wherein the physical attribute is detected using three-dimensional imaging, facial recognition, or biometrics.
  • 4. The method of claim 1, wherein the received input includes at least one of a button press, a joystick movement, a D-pad movement, a touch event, and a detected motion.
  • 5. The method of claim 4, wherein the unpaired input device includes a motion sensor to sense the detected motion, the motion sensor providing the detected motion input to the computing device.
  • 6. The method of claim 1, wherein the unpaired input device includes a tag, and wherein the position of the unpaired input device is calculated based upon a detected position of the tag.
  • 7. The method of claim 1, wherein calculating the position of the unpaired input device upon receiving the input includes a beamforming technique to determine the distance and the angle of the unpaired input device from the computing device.
  • 8. The method of claim 1, wherein the candidate list includes a body portion distance, wherein the body portion distance is at least one of a hand distance, a geometric center distance, a leg distance, a knee distance, a shoulder distance, and an elbow distance of each detected user as measured with respect to the position of the unpaired input device.
  • 9. The method of claim 8, wherein the nearest user is a user on the candidate list with a strongest correlation to the unpaired input device, wherein one body portion distance for the nearest user has the strongest correlation to the unpaired input device.
  • 10. The method of claim 1, wherein assigning the nearest user to the unpaired input device includes automatically pairing a profile associated with the nearest user to the unpaired input device.
  • 11. The method of claim 1, further comprising tracking the nearest user and the first paired input device by correlating a motion of a body portion of the nearest user to a motion of the first paired input device, the motion of the nearest user's body portion including the nearest user providing input to the computing device via interaction with the first paired input device.
  • 12. The method of claim 1, further comprising transferring the first paired input device to a second user if a position of the second user has the strongest correlation to the position of the first paired input device, the second user being another detected user on the candidate list.
  • 13. The method of claim 1, further comprising pairing more than one input device to the user.
  • 14. The method of claim 1, further comprising: maintaining nearest user pairing to the first paired input device until a pair terminating or pair transferring event occurs.
  • 15. The method of claim 14, further comprising: terminating an association of the first user to the first paired input device if input from the first paired input device is not received for a threshold period of time, and after the association is terminated, returning the first paired input device to an unpaired state.
  • 16. A non-volatile memory holding instructions executable by a processor to: receive an input from an unpaired input device within an observed scene; calculate a position of the unpaired input device upon receiving the input, the position including a distance and an angle from a computing device; detect one or more users within the observed scene via a capture device of the computing device; create a candidate list of the one or more detected users determined to be within a vicinity of the unpaired input device, the candidate list including a distance between each user and the unpaired input device; assign a first user from the candidate list to the unpaired input device to initiate pairing, the first user being a nearest user to the unpaired input device, and once assigned, the unpaired input device being a first paired input device; switch the first paired input device from the first user to a second user when the second user is closer to the first paired input device than the first user; switch a second paired input device of the second user from the second user to the first user when the first user is closer to the second paired input device than the second user; and concurrently pair both the first paired input device and the second paired input device to the first user when the first user is closer than the second user to both the first paired input device and the second paired input device.
  • 17. The memory of claim 16, further comprising instructions executable by a processor to identify each detected user within the observed scene by identifying a physical attribute of each detected user and to match the physical attribute of each detected user to a profile associated with each detected user, wherein the physical attribute is detected using three-dimensional imaging, facial recognition, or biometrics.
  • 18. The memory of claim 16, further comprising instructions executable by a processor to track the nearest user and the first paired input device by correlating a motion of a body portion of the nearest user to a motion of the first paired input device, the motion of the nearest user's body portion including the nearest user providing input to the computing device via interaction with the first paired input device.
  • 19. The memory of claim 16, further comprising instructions executable by a processor to terminate an association of the nearest user to the first paired input device if input from the first paired input device is not received for a threshold period of time, and if the association is terminated, returning the first paired input device to an unpaired state.
  • 20. A method for automatically pairing an input device to a user, the method executed by a computing device communicatively coupled to the input device, the method comprising: receiving an input from an unpaired input device within an observed scene; calculating a position of the unpaired input device upon receiving the input, the position including a distance and an angle relative to the computing device; detecting one or more users within the observed scene via a capture device of the computing device; identifying each detected user within the observed scene by identifying a physical attribute of each detected user and matching the physical attribute of each detected user to a profile associated with each detected user; creating a candidate list of the one or more identified users determined to be within a vicinity of the unpaired input device, the candidate list including a right hand distance, a left hand distance, and a geometric center distance between each user and the unpaired input device; maintaining nearest user pairing to the paired input device until a pair terminating or pair transferring event occurs; assigning a first user from the candidate list to the unpaired input device to initiate pairing, the first user being a nearest user to the unpaired input device, the right hand distance, the left hand distance, or the geometric center distance for the nearest user having a strongest correlation to the unpaired input device, and once assigned, the unpaired input device being a first paired input device; tracking the nearest user and the first paired input device by correlating a motion of the nearest user's right hand or left hand to a motion of the first paired input device, the motion of the nearest user's right hand or left hand including the nearest user providing input to the computing device via interaction with the first paired input device; terminating an association of the nearest user to the first paired input device if input from the first paired input device is not received for a threshold period of time, and if the association is terminated; switching the first paired input device from the first user to a second user when the second user is closer to the first paired input device than the first user; and switching a second paired input device of the second user from the second user to the first user when the first user is closer to the second paired input device than the second user; transferring the first paired input device to the second user if a position of the second user has the strongest correlation to the position of the first paired input device, the second user being another detected user on the candidate list; and concurrently pairing both the first paired input device and the second paired input device to the first user when the first user is closer than the second user to both the first paired input device and the second paired input device.
US Referenced Citations (192)
Number Name Date Kind
4627620 Yang Dec 1986 A
4630910 Ross et al. Dec 1986 A
4645458 Williams Feb 1987 A
4695953 Blair et al. Sep 1987 A
4702475 Elstein et al. Oct 1987 A
4711543 Blair et al. Dec 1987 A
4751642 Silva et al. Jun 1988 A
4796997 Svetkoff et al. Jan 1989 A
4809065 Harris et al. Feb 1989 A
4817950 Goo Apr 1989 A
4843568 Krueger et al. Jun 1989 A
4893183 Nayar Jan 1990 A
4901362 Terzian Feb 1990 A
4925189 Braeunig May 1990 A
5101444 Wilson et al. Mar 1992 A
5148154 MacKay et al. Sep 1992 A
5184295 Mann Feb 1993 A
5229754 Aoki et al. Jul 1993 A
5229756 Kosugi et al. Jul 1993 A
5239463 Blair et al. Aug 1993 A
5239464 Blair et al. Aug 1993 A
5288078 Capper et al. Feb 1994 A
5295491 Gevins Mar 1994 A
5320538 Baum Jun 1994 A
5347306 Nitta Sep 1994 A
5385519 Hsu et al. Jan 1995 A
5405152 Katanics et al. Apr 1995 A
5417210 Funda et al. May 1995 A
5423554 Davis Jun 1995 A
5454043 Freeman Sep 1995 A
5469740 French et al. Nov 1995 A
5495576 Ritchey Feb 1996 A
5516105 Eisenbrey et al. May 1996 A
5524637 Erickson et al. Jun 1996 A
5534917 MacDougall Jul 1996 A
5563988 Maes et al. Oct 1996 A
5577981 Jarvik Nov 1996 A
5580249 Jacobsen et al. Dec 1996 A
5594469 Freeman et al. Jan 1997 A
5597309 Riess Jan 1997 A
5616078 Oh Apr 1997 A
5617312 Iura et al. Apr 1997 A
5638300 Johnson Jun 1997 A
5641288 Zaenglein Jun 1997 A
5682196 Freeman Oct 1997 A
5682229 Wangler Oct 1997 A
5690582 Ulrich et al. Nov 1997 A
5703367 Hashimoto et al. Dec 1997 A
5704837 Iwasaki et al. Jan 1998 A
5715834 Bergamasco et al. Feb 1998 A
5875108 Hoffberg et al. Feb 1999 A
5877803 Wee et al. Mar 1999 A
5913727 Ahdoot Jun 1999 A
5933125 Fernie Aug 1999 A
5980256 Carmein Nov 1999 A
5989157 Walton Nov 1999 A
5995649 Marugame Nov 1999 A
6005548 Latypov et al. Dec 1999 A
6009210 Kang Dec 1999 A
6054991 Crane et al. Apr 2000 A
6066075 Poulton May 2000 A
6072494 Nguyen Jun 2000 A
6073489 French et al. Jun 2000 A
6077201 Cheng et al. Jun 2000 A
6098458 French et al. Aug 2000 A
6100896 Strohecker et al. Aug 2000 A
6101289 Kellner Aug 2000 A
6128003 Smith et al. Oct 2000 A
6130677 Kunz Oct 2000 A
6141463 Covell et al. Oct 2000 A
6147678 Kumar et al. Nov 2000 A
6152856 Studor et al. Nov 2000 A
6159100 Smith Dec 2000 A
6173066 Peurach et al. Jan 2001 B1
6181343 Lyons Jan 2001 B1
6188777 Darrell et al. Feb 2001 B1
6215890 Matsuo et al. Apr 2001 B1
6215898 Woodfill et al. Apr 2001 B1
6226396 Marugame May 2001 B1
6229913 Nayar et al. May 2001 B1
6256033 Nguyen Jul 2001 B1
6256400 Takata et al. Jul 2001 B1
6283860 Lyons et al. Sep 2001 B1
6289112 Jain et al. Sep 2001 B1
6299308 Voronka et al. Oct 2001 B1
6308565 French et al. Oct 2001 B1
6316934 Amorai-Moriya et al. Nov 2001 B1
6363160 Bradski et al. Mar 2002 B1
6384819 Hunter May 2002 B1
6411744 Edwards Jun 2002 B1
6430997 French et al. Aug 2002 B1
6476834 Doval et al. Nov 2002 B1
6496598 Harman Dec 2002 B1
6503195 Keller et al. Jan 2003 B1
6539931 Trajkovic et al. Apr 2003 B2
6570555 Prevost et al. May 2003 B1
6633294 Rosenthal et al. Oct 2003 B1
6640202 Dietz et al. Oct 2003 B1
6661918 Gordon et al. Dec 2003 B1
6681031 Cohen et al. Jan 2004 B2
6714665 Hanna et al. Mar 2004 B1
6731799 Sun et al. May 2004 B1
6738066 Nguyen May 2004 B1
6765726 French et al. Jul 2004 B2
6788809 Grzeszczuk et al. Sep 2004 B1
6801637 Voronka et al. Oct 2004 B2
6873723 Aucsmith et al. Mar 2005 B1
6876496 French et al. Apr 2005 B2
6937742 Roberts et al. Aug 2005 B2
6950534 Cohen et al. Sep 2005 B2
7003134 Covell et al. Feb 2006 B1
7036094 Cohen et al. Apr 2006 B1
7038855 French et al. May 2006 B2
7039676 Day et al. May 2006 B1
7042440 Pryor et al. May 2006 B2
7050606 Paul et al. May 2006 B2
7058204 Hildreth et al. Jun 2006 B2
7060957 Lange et al. Jun 2006 B2
7113918 Ahmad et al. Sep 2006 B1
7121946 Paul et al. Oct 2006 B2
7170492 Bell Jan 2007 B2
7184048 Hunter Feb 2007 B2
7202898 Braun et al. Apr 2007 B1
7222078 Abelow May 2007 B2
7227526 Hildreth et al. Jun 2007 B2
7259747 Bell Aug 2007 B2
7308112 Fujimura et al. Dec 2007 B2
7317836 Fujimura et al. Jan 2008 B2
7348963 Bell Mar 2008 B2
7359121 French et al. Apr 2008 B2
7367887 Watabe et al. May 2008 B2
7379563 Shamaie May 2008 B2
7379566 Hildreth May 2008 B2
7389591 Jaiswal et al. Jun 2008 B2
7412077 Li et al. Aug 2008 B2
7421093 Hildreth et al. Sep 2008 B2
7430312 Gu Sep 2008 B2
7436496 Kawahito Oct 2008 B2
7450736 Yang et al. Nov 2008 B2
7452275 Kuraishi Nov 2008 B2
7460690 Cohen et al. Dec 2008 B2
7489812 Fox et al. Feb 2009 B2
7536032 Bell May 2009 B2
7555142 Hildreth et al. Jun 2009 B2
7560701 Oggier et al. Jul 2009 B2
7570805 Gu Aug 2009 B2
7574020 Shamaie Aug 2009 B2
7576727 Bell Aug 2009 B2
7590262 Fujimura et al. Sep 2009 B2
7593552 Higaki et al. Sep 2009 B2
7598942 Underkoffler et al. Oct 2009 B2
7607509 Schmiz et al. Oct 2009 B2
7620202 Fujimura et al. Nov 2009 B2
7668340 Cohen et al. Feb 2010 B2
7680298 Roberts et al. Mar 2010 B2
7683954 Ichikawa et al. Mar 2010 B2
7684592 Paul et al. Mar 2010 B2
7701439 Hillis et al. Apr 2010 B2
7702130 Im et al. Apr 2010 B2
7704135 Harrison, Jr. Apr 2010 B2
7710391 Bell et al. May 2010 B2
7729530 Antonov et al. Jun 2010 B2
7746345 Hunter Jun 2010 B2
7760182 Ahmad et al. Jul 2010 B2
7809167 Bell Oct 2010 B2
7834846 Bell Nov 2010 B1
7852262 Namineni et al. Dec 2010 B2
RE42256 Edwards Mar 2011 E
7898522 Hildreth et al. Mar 2011 B2
7976372 Baerlocher et al. Jul 2011 B2
8035612 Bell et al. Oct 2011 B2
8035614 Bell et al. Oct 2011 B2
8035624 Bell et al. Oct 2011 B2
8072470 Marks Dec 2011 B2
20040190776 Higaki et al. Sep 2004 A1
20060202952 Sato et al. Sep 2006 A1
20080026838 Dunstan et al. Jan 2008 A1
20080106517 Kerr et al. May 2008 A1
20090138805 Hildreth May 2009 A1
20090253410 Fitzgerald et al. Oct 2009 A1
20100208942 Porter et al. Aug 2010 A1
20100273130 Chai et al. Oct 2010 A1
20110021271 Ikeda et al. Jan 2011 A1
20110069940 Shimy et al. Mar 2011 A1
20110111846 Ciarrocchi May 2011 A1
20110118032 Zalewski May 2011 A1
20110159959 Mallinson et al. Jun 2011 A1
20110202269 Reventlow Aug 2011 A1
20110291929 Yamada et al. Dec 2011 A1
20120202597 Yee et al. Aug 2012 A1
20120254986 Levien et al. Oct 2012 A1
20130176106 Schultz et al. Jul 2013 A1
Foreign Referenced Citations (7)
Number Date Country
201254344 Jun 2010 CN
0583061 Feb 1994 EP
08044490 Feb 1996 JP
9310708 Jun 1993 WO
9717598 May 1997 WO
9944698 Sep 1999 WO
WO 2012037618 Mar 2012 WO
Non-Patent Literature Citations (30)
Entry
Bonsor, et al., “How Facial Recognition Systems Work”, Retrieved at <<http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/facial-recognition2.htm>>, Retrieved Date: Sep. 1, 2011, p. 1.
“The Kinect Robot”, Retrieved at <<http://www.robots-and-androids.com/kinect-robot.html>>, Retrieved Date: Sep. 1, 2011, p. 1.
Dbrendant, “Project Natal: Targeted Advertising, a Biometric Nightmare”, Retrieved at <<http://eyetrackingupdate.com/2010/05/11/project-natal-targeted-advertising-biometric-nightmare/>>, May 11, 2010, pp. 2.
Kanade et al., "A Stereo Machine for Video-rate Dense Depth Mapping and its New Applications", IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 1996, pp. 196-202, The Robotics Institute, Carnegie Mellon University, Pittsburgh, PA.
Miyagawa et al., “CCD-Based Range Finding Sensor”, Oct. 1997, pp. 1648-1652, vol. 44 No. 10, IEEE Transactions on Electron Devices.
Rosenhahn et al., “Automatic Human Model Generation”, 2005, pp. 41-48, University of Auckland (CITR), New Zealand.
Aggarwal et al., “Human Motion Analysis: A Review”, IEEE Nonrigid and Articulated Motion Workshop, 1997, University of Texas at Austin, Austin, TX.
Shao et al., “An Open System Architecture for a Multimedia and Multimodal User Interface”, Aug. 24, 1998, Japanese Society for Rehabilitation of Persons with Disabilities (JSRPD), Japan.
Kohler, “Special Topics of Gesture Recognition Applied in Intelligent Home Environments”, In Proceedings of the Gesture Workshop, 1998, pp. 285-296, Germany.
Kohler, “Vision Based Remote Control in Intelligent Home Environments”, University of Erlangen-Nuremberg/Germany, 1996, pp. 147-154, Germany.
Kohler, “Technical Details and Ergonomical Aspects of Gesture Recognition applied in Intelligent Home Environments”, 1997, Germany.
Hasegawa et al., “Human-Scale Haptic Interaction with a Reactive Virtual Human in a Real-Time Physics Simulator”, Jul. 2006, vol. 4, No. 3, Article 6C, ACM Computers in Entertainment, New York, NY.
Qian et al., “A Gesture-Driven Multimodal Interactive Dance System”, Jun. 2004, pp. 1579-1582, IEEE International Conference on Multimedia and Expo (ICME), Taipei, Taiwan.
Zhao, “Dressed Human Modeling, Detection, and Parts Localization”, 2001, The Robotics Institute, Carnegie Mellon University, Pittsburgh, PA.
He, “Generation of Human Body Models”, Apr. 2005, University of Auckland, New Zealand.
Isard et al., “Condensation—Conditional Density Propagation for Visual Tracking”, 1998, pp. 5-28, International Journal of Computer Vision 29(1), Netherlands.
Livingston, “Vision-based Tracking with Dynamic Structured Light for Video See-through Augmented Reality”, 1998, University of North Carolina at Chapel Hill, North Carolina, USA.
Wren et al., "Pfinder: Real-Time Tracking of the Human Body", MIT Media Laboratory Perceptual Computing Section Technical Report No. 353, Jul. 1997, vol. 19, No. 7, pp. 780-785, IEEE Transactions on Pattern Analysis and Machine Intelligence, Cambridge, MA.
Breen et al., "Interactive Occlusion and Collision of Real and Virtual Objects in Augmented Reality", Technical Report ECRC-95-02, 1995, European Computer-Industry Research Center GmbH, Munich, Germany.
Freeman et al., "Television Control by Hand Gestures", Dec. 1994, Mitsubishi Electric Research Laboratories, TR94-24, Cambridge, MA.
Hongo et al., “Focus of Attention for Face and Hand Gesture Recognition Using Multiple Cameras”, Mar. 2000, pp. 156-161, 4th IEEE International Conference on Automatic Face and Gesture Recognition, Grenoble, France.
Pavlovic et al., “Visual Interpretation of Hand Gestures for Human-Computer Interaction: A Review”, Jul. 1997, pp. 677-695, vol. 19, No. 7, IEEE Transactions on Pattern Analysis and Machine Intelligence.
Azarbayejani et al., “Visually Controlled Graphics”, Jun. 1993, vol. 15, No. 6, IEEE Transactions on Pattern Analysis and Machine Intelligence.
Granieri et al., “Simulating Humans in VR”, The British Computer Society, Oct. 1994, Academic Press.
Brogan et al., “Dynamically Simulated Characters in Virtual Environments”, Sep./Oct. 1998, pp. 2-13, vol. 18, Issue 5, IEEE Computer Graphics and Applications.
Fisher et al., “Virtual Environment Display System”, ACM Workshop on Interactive 3D Graphics, Oct. 1986, Chapel Hill, NC.
"Virtual High Anxiety", Tech Update, Aug. 1995, p. 22.
Sheridan et al., “Virtual Reality Check”, Technology Review, Oct. 1993, pp. 22-28, vol. 96, No. 7.
Stevens, “Flights into Virtual Reality Treating Real World Disorders”, The Washington Post, Mar. 27, 1995, Science Psychology, 2 pages.
“Simulation and Training”, 1994, Division Incorporated.
Related Publications (1)
Number Date Country
20130154917 A1 Jun 2013 US