A portion of the disclosure of this patent document contains material which is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the U.S. Patent & Trademark Office patent file or records, but otherwise reserves all copyright rights whatsoever.
The present invention generally relates to game play devices, and more particularly to realistic, motion-controlled game play devices.
Game play devices have traditionally been utilized for physical recreational activity among two or more people at a multitude of venues, such as at home, outdoors or at family entertainment centers. Conventional game play devices are generally further designed for use within a simulated combat environment, where the game play device may be disguised as a handgun, small arm or in a variety of other weaponry formats.
In order to simulate munitions discharge, many conventional game play devices utilize emissions of collimated beams of infrared (IR) light that are directional so that they may be targeted by one participant against one or more of the remaining participants in the game play. Each participant must further employ wearable IR light sensors that may be used to detect IR light that may be incident upon a particular body part (e.g., head or torso) of each participant. Further, the IR light emissions may be encoded with identifying information about the participant and/or the participant's game play device such that when the emitted IR light is detected by the receiving IR light sensor, so is the data encoded within the IR light emission. Accordingly, at the end of game play, each participant may be able to accumulate statistics relevant to the game play, such as the number of “hits” received by the wearable sensors of the participant as compared to the number of “hits” administered by the participant against the remaining participants of the game play.
Conventional game play using conventional game play devices, therefore, may be somewhat mundane since the statistics relating to game play performance are limited mainly to the number of “hits” received by a participant's clothing-mounted sensors as compared to the number of “hits” administered by that participant, and such statistics are traditionally gathered only once game play has concluded. Accordingly, the results are not multi-dimensional but rather binary in nature, since the “winner” of the game play may simply be declared as the participant receiving the fewest number of “hits” on their wearable sensors in one example, or in another example, the winner of the game play may be the participant who delivered the greatest number of “hits” against his or her opponents.
Further, conventional game play devices include imperfections that inherently divulge their identity as a toy rather than as a genuine article. For example, conventional game play devices utilize poor quality materials and inferior manufacturing techniques that highlight certain abnormalities and imperfections, such as visible input/output (I/O) control components (e.g., on/off switches) and manufacturing flaws (e.g., snap-together sub-components) that divulge interface seams defining each location where one sub-component interfaces with another sub-component of the game play device. Such imperfections detract from the immersive quality of conventional game play devices.
Efforts continue, therefore, to develop realistic game play devices with superior aesthetic qualities and multi-dimensional game play capabilities to better immerse each participant into an authentic game play experience without the need to augment the game play device with additional components external to the game play device.
To overcome limitations in the prior art, and to overcome other limitations that will become apparent upon reading and understanding the present specification, various embodiments of the present invention disclose multi-dimensional game play methods and realistic, high-quality game play devices for use therewith. Game play devices in accordance with the present invention may be made to mimic genuine articles having increased functionality for game play utility as well as for non-game play utility. Game play devices in accordance with the present invention may further provide enhanced user feedback (e.g., audible, visible and tactile feedback) with enhanced motion detection during game play so as to provide a more immersive game play experience.
In accordance with one embodiment of the invention, a game play device comprises an outer structure formed to mimic a magic wand. The outer structure is configured to conceal internal components of the game play device. The internal components of the game play device include a single inertial measurement unit (IMU) configured to detect movements of the game play device and a processor configured to construe the movements as commands. The processor uses the commands to control one or more operational modes of the game play device.
In accordance with an alternate embodiment of the invention, a method of detecting movement of a game play device comprises determining roll, pitch and yaw values using acceleration and gyroscopic data collected within a coordinate frame of a game play device, filtering at least one of the determined roll, pitch and yaw values, compensating for reduced magnitudes of the gyroscopic data when the roll value resides within a transition region, normalizing the acceleration and gyroscopic data to the determined roll value, excluding yaw values when the game play device exhibits verticality, and registering the detected movements as command gestures.
In accordance with an alternate embodiment of the invention, a gesture detection system comprises an inertial measurement unit (IMU) and a processor coupled to the IMU. The processor includes a gesture detection module coupled to the IMU. The gesture detection system further comprises a memory coupled to the gesture detection module. The memory is configured to store a sequence of detected gesture steps and a plurality of gesture step templates. The gesture detection module is configured to determine a first proximity difference between the sequence of detected gesture steps as compared to at least a first portion of the plurality of gesture step templates.
Various aspects and advantages of the invention will become apparent upon review of the following detailed description and upon reference to the drawings in which:
Generally, the various embodiments of the present invention are applied to methods of implementing game play activities and the game play devices for use therewith. In one embodiment, a game play device may genuinely mimic a magic wand by virtually removing from sight any mechanical discontinuities and imperfections thereby better mimicking a genuine article. Furthermore, while the game play device of the present invention includes many lighting, sound and haptic effects that may be adjustable by the user, no visible input/output (I/O) devices may exist for adjustment. Rather, certain external components of the game play device may be manipulated by the user to expose the I/O devices from an otherwise stealthy existence in order to gain access to such I/O devices. Conversely, the need for certain I/O devices may simply be obviated due to the increased controllability (e.g., gesture detection) of the game play device.
The game play device may, for example, employ enhanced motion detection through the use of a single device (e.g., inertial measurement unit (IMU)) and enhanced gesture detection to control both game play and non-game play functionality. Accordingly, the need for more than one detector (e.g., accelerometer or IMU) and the associated relative motion measurements of the multiple detectors is obviated.
Navigation of the multiple menus available within the game play device or assistance during game play may be accomplished through the use of motion-controlled commands generated by one or more user-initiated movements of the game play device as may be detected via absolute one-dimensional, two-dimensional and/or three-dimensional measurements taken from a single device (e.g., IMU or accelerometer) contained within the game play device. As per one example, movements that simulate a two-dimensional drawing (e.g., as if the game play device were being utilized to draw a two-dimensional figure onto an imaginary surface defined by the X-Y, Y-Z and/or Z-X coordinate planes) may be detected by a single IMU of the game play device and translated by a processor contained within the game play device into menu navigation commands that may allow the user of the game play device to traverse various operational modes provided by the game play device and/or to receive audible assistance cues before and during game play. Alternately, certain commands may be initiated by the user via three-dimensional movements (e.g., as if the game play device were being utilized to create a three-dimensional figure within a volume defined by the X-Y-Z coordinate space) that may be similarly detected by a processor located within the game play device using, for example, three-dimensional measurements taken by a single device (e.g., IMU or accelerometer).
Additionally, the IMU may provide information relating to the linear movement (e.g., linear velocity and linear acceleration) of the game play device and/or the rotational movement (e.g., angular velocity and angular acceleration) of the game play device. Accordingly, detection of the absolute position, velocity and/or acceleration parameters used to transcribe the two-dimensional and/or three-dimensional motion commands may be used to provide further control information relative to the operational state of the game play device.
Immersion of the user into fantasy game play may be effectuated by physical cues (e.g., visible, audible and tactile feedback) as generated by the game play device. For example, visible cues (e.g., via light generated by light emitting diodes (LEDs) from an interior of the game play device) may be implemented to provide the user with real-time game play statistics within a given match, such as, for example, whether the user sustained a “hit” or blocked a “hit.” Further, the game play device may record statistics relating to the game play, such as the number of “hits” administered by the user, the number of “hits” sustained by the user and the number of would-be “hits” successfully shielded by the user, and may then provide such statistics to support a final report at the end of a match. Visible, audible and/or tactile cues may also be utilized by the game play device to, for example, provide confirmation of special character selection (e.g., whether the participant wishes to select himself/herself or another participant as a team leader or master wizard) within a given match.
Audible cues may, for example, be utilized by a user of the game play device to obtain audible instructions from the game play device in response to a user-initiated command (e.g., two-dimensional or three-dimensional movements recognized by the game play device as a request from the user for audible instructions before and/or during game play). Alternately, a game play device (e.g., a magic wand) may be waved by the user in proximity to a game play prop (e.g., a dragon egg) that may detect the presence of the magic wand (e.g., via near-field communications (NFC) protocol) and in response communicate wirelessly (e.g., via Wi-Fi, NFC, Bluetooth, Bluetooth mesh or thread-based mesh) to the magic wand, which may then convert the wireless communication to audible cues (e.g., instructions as to how to capture the dragon egg) to the user's advantage during game play (e.g., how to keep the dragon egg from capture by the other participants of the game play).
Tactile cues (e.g., vibration), for example, may be generated by the game play device during game play to further immerse the user into the fantasy game play experience. As per one embodiment, the game play device may vibrate to indicate a transition from sleep mode into standby mode, to indicate the start of a match, to indicate navigation through settings menu items, to cast a spell or any other operational mode indications that may be useful. Operational statistics may further be communicated via vibration to indicate, for example, a “hit” received from an opponent, blocking a “hit” that would have otherwise been received from an opponent, the depletion of life sustenance (e.g., mana) or the receipt of mana.
As per an alternate embodiment, directional gyroscopic forces may be generated from within the game play device that may be utilized to point the game play device along a vector that may indicate the location of a game play prop (e.g., a dragon egg or secret passageway) and thereby guide the user of the game play device toward the game play prop. As per an alternate embodiment, tactile cues (e.g., generated by a vibration motor or piezo-electric device) may be activated from within the game play device to communicate to the user during game play (e.g., a combination of short and long bursts of vibration to generate a Morse Code message) that may then be used by the user to modify his or her actions (e.g., utilize the message as a clue during a scavenger hunt) as a part of game play.
In order to conceal internal electronic devices (e.g., LEDs) from external view of the game play device, light pipes (e.g., fiber-optic cable) may be utilized within an interior of the game play device such that light may be delivered to orifices (e.g., holes and cracks that may naturally exist within a game play device that mimics a magic wand) without disclosing to the user the existence and location of the LED within the game play device. In one embodiment, light (e.g., broadband white light) may be generated from within the game play device and routed via a light pipe to perform a non-game play activity (e.g., providing illumination reminiscent of a flashlight). In an alternate embodiment, light may be delivered to orifices of the game play device such that optically significant attachments (e.g., snap-on diffusers) may be attached to such orifices and may be utilized to alter the emitted light (e.g., alter the direction and/or color of the emitted light) in order to provide game play communication to the user or to provide other special effects during game play. Alternately, LEDs within the game play device may include LEDs capable of emitting programmable colored light.
Ultraviolet (UV) light may be generated from within the game play device and routed (e.g., via light pipe) to perform game play activities (e.g., illuminating invisible ink inscriptions or creating lighting effects in a rave). Alternate devices may, for example, attach to the exterior of the game play device in such a manner as to facilitate receipt of light generated from within the game play device (e.g., infrared (IR) light generated by an LED) and propagate the received light to an exterior of the game play device (e.g., along a shaft of a magic wand) in an optically significant manner (e.g., diffusing/collimating the light into specific light distribution patterns) to, for example, generate short-range wide bursts and long-range narrow bursts, respectively, of IR light. Alternately, IR light distribution patterns may be effectuated simply by selecting IR emitters (e.g., LEDs) with secondary optics designed to generate the desired distribution pattern widths.
Game play devices in accordance with the present invention may not only accumulate game play statistics, but may also communicate such game play statistics to other game play devices during game play and/or at the end of game play. As per one example, wireless communications (e.g., Wi-Fi, NFC, IR, Bluetooth, Bluetooth mesh or thread-based mesh) may be implemented within each game play device and may be utilized to allow the exchange of game play information between participating game play devices during game play to allow such information to be communicated (e.g., via visible, audible and/or tactile feedback) to the users of such game play devices. Such game play information may, for example, motivate those users who may be lagging behind the leading scorers to increase their level and/or quality of game play. Alternatively, game play statistics may only be communicated at the end of game play, so as to increase the suspense that may be gained by delaying the ultimate game play tally.
Turning to
A game play device (e.g., magic wand 100) may, for example, exhibit an overall length 102 of between fifteen and nineteen inches (e.g., approximately 17 inches) or between fifteen and twenty inches (e.g., approximately 18 inches) and may range in girth from between one and two inches in diameter (e.g., approximately 1 inch in diameter) at handle 104 to between one-half inch and one inch in diameter (e.g., approximately ¾ inch in diameter) at tip 106. Certain of the exterior features of game play device 100 may be utilized, for example, to conceal various I/O features, such as an on/off button (e.g., capacitive or contact-based switch not shown) and a charging port (e.g., USB-C interface not shown), that may be obscured (e.g., behind barrel 108 and/or cap (pommel) 110) and revealed upon manipulation (e.g., rotation of barrel 108) in order to allow access to such I/O features. Additionally, cap (pommel) 110 may be stylized in the form of a Celtic knot and may be configured to be optionally removed to reveal, for example, a programming/diagnostics port (e.g., a USB-C interface not shown). Alternatively, I/O features may be concealed within the geometry of game play device 100 using covers or surfaces for buttons and ports that match the surface of game play device 100. As per an example, the USB-C interface (not shown) may be hidden beneath a flexible sleeve fitted over handle 104 that may allow the on/off button to be activated without being seen and the USB-C interface to be covered by a portion of the sleeve that may be partially cut out and connected by a living hinge.
In alternate embodiments, the need for certain I/O features may be obviated via detection of movement of the game play device in a particular manner. As discussed in more detail below, for example, detected gestures may instead be used by the game play device while in a sleep mode of operation to transition from sleep mode to active game play mode upon the occurrence of a particular detected gesture.
Game play device 100 may, for example, further include mechanical features that may facilitate audible, tactile and/or visible emissions. As per one example, cap (pommel) 110 may be arranged so as to allow the concealment of orifices that may be used to propagate sound generated from within game play device 100 (e.g., via one or more speakers not shown) that may then be heard by game play participants within proximity to game play device 100. Other orifices (not shown) may exist (e.g., along body portions 104, 112 and/or 114) that may allow the propagation of light generated from within the game play device (e.g., LED light) along a periphery of game play device 100 (e.g., along portions 104, 112 and/or 114), which may then be detected by users that are within proximity of game play device 100 and discerned by those users as coded outputs emitted by game play device 100 (e.g., command acknowledgments or game play statistics).
Game play device 100 may construe detected movements as motion commands, or gestures, that may activate features and provide access to operational menus executed by a processor (not shown) operating within game play device 100. In one embodiment, game play device 100 may include an IMU (not shown) that when combined with gesture detection software executed by a processor (not shown) of game play device 100, may detect such gestures and may cause game play device 100 to behave in accordance with the manner in which the detected gestures may be construed by game play device 100.
Turning to
An on/off switch 228 (e.g., capacitive or continuity-sense switch) may be provided as well so as to activate the game play device for operation. In alternate embodiments, movement of the game play device in a particular manner (e.g., as detected by IMU 208 and construed by gesture detection 212) may instead be used to wake the game play device from a sleep mode of operation and transition the game play device to an operational game play mode.
A single IMU 208 may be provided, which may generate information relating to the orientation and/or movement of a game play device (e.g., game play device 100 of
Processor 206 may include wireless interface 226 (e.g., Wi-Fi, NFC, Bluetooth, Bluetooth mesh or thread-based mesh) that may be used to communicate with other game play devices and/or game play props (e.g., dragon eggs, mana supply crystals, etc.) during game play. Wireless interface 226 may further be utilized to wirelessly communicate firmware updates to processor 206 and/or receive diagnostic information from processor 206. Conversely, an I/O port (e.g., USB-C port 230) may optionally be used instead for such purposes.
Turning to
As per one example, gesture movement 302 may be characterized by starting position 304 followed by upward movement 306 relative to starting position 304 (e.g., as detected by IMU 208 and gesture detection 212 of
Similarly, gesture movement 308 may be characterized by starting position 310 followed by downward movement 312 relative to starting position 310 (e.g., as detected by IMU 208 and gesture detection 212 of
As per another example, gesture movement 314 may be characterized by starting position 316 followed by leftward movement 318 relative to starting position 316 (e.g., as detected by IMU 208 and gesture detection 212 of
Similarly, gesture movement 320 may be characterized by starting position 322 followed by rightward movement 324 relative to starting position 322 (e.g., as detected by IMU 208 and gesture detection 212 of
As per another example, gesture movement 326 may be characterized by starting position 332 followed by alternating left-to-right and right-to-left movements 328 relative to starting position 332 (e.g., as detected by IMU 208 and gesture detection 212 of
Turning to
As per one example, gesture movement 402 may be characterized by starting position 404 followed by unique gesture downward movement indication 406 relative to starting position 404 (e.g., as detected by IMU 208 and gesture detection 212 of
Once processor 206 transitions to settings mode (e.g., via the settings mode command initiated by gesture movement 402 or 402A), the game play device may query the user (e.g., via an audible query issued via speaker 214 of
Gesture movement 410 may be characterized by an imaginary slider 412 that may be moved up and down via the detected movements of certain components of a game play device (e.g., tip 106 of game play device 100 of
Once the user has landed on the last menu selection of the first settings control menu (e.g., volume settings menu), the user may continue to toggle through the remaining settings control menus by, for example, responding to audible prompts issued by the game play device. For example, a second settings control menu (e.g., vibration settings menu) may be prompted by the game play device by asking the user whether haptic feedback is to be enabled. In response, the user may either activate the haptic feedback option by issuing a “select” gesture (e.g., gesture movement 320 of
Other settings menus may control lighting elements based on their physical position on the game play device (e.g., side lights and tip lights), whereby the user may adjust the brightness and color of the tip lights while further opting to toggle the side lights on or off. In one embodiment, gesture movement 410 may be interpreted (e.g., by gesture detection 212 of
A fourth settings control menu (e.g., color settings menu) may be selected by issuing gesture movement 422 within the surface area defined by imaginary color wheel 424 (e.g., two-dimensional movement within the X-Y, Y-Z or Z-X coordinate planes), which may be interpreted (e.g., by gesture detection 212 of
It should be noted, for example, that while the user is pointing the game play device (e.g., game play device 100 of
Turning to
As per another example, gesture movement 504 may be interpreted (e.g., by gesture detection 212 of
It should be noted that by using similar gesture movement detection, a virtually unlimited number of game modes may be effectuated by the detection of the unique gesture movement that may be specific to a particular game mode. As per one example, gesture movement 510 may be interpreted (e.g., by gesture detection 212 of
A subsequent gesture movement (e.g., gesture movements 302 and 308 of
Once the host of the game play has selected the desired game play mode, subsequent gesture movement 514 may then be used by the host game play device to initiate game play. As per one example, once gesture movement 514 is detected (e.g., by gesture detection 212 of
Turning to
In one embodiment, gesture movement 602 may be interpreted (e.g., by gesture detection 212 of
As a result, one or more IR receivers/transceivers (e.g., four IR receivers/transceivers 202 as discussed in more detail below in relation to
In alternate embodiments, real-time feedback may be provided by the game play device, such that the user of the game play device may gauge his or her performance during game play. As per one example, once an IR receiver/transceiver (e.g., one of IR receivers/transceivers 202 of
Similarly, gesture movement 610 may be characterized by starting position 612 followed by circular movement 614 and downward movement 616 relative to starting position 604 (e.g., as detected by IMU 208 and gesture detection 212 of
As a result, one or more IR receivers/transceivers (e.g., four IR receivers/transceivers 202 as discussed in more detail below in relation to
After game play terminates, the contents of memory 232 may be downloaded (e.g., via wireless interface 226 of all participating game play devices) into a central game repository, which may then be analyzed to determine the game statistics (e.g., the participant who administered the highest number of “fireballs” registered by other game participants), and each game play device may then signal (e.g., audibly and/or visibly) the corresponding individual achievement. If the match play required teams, then the winning team's game play devices may illuminate with victory colors and play victory sounds, while the losing team's game play devices may be muted and remain unlit for an amount of time.
In alternate embodiments, real-time feedback may be provided by the game play device such that the user of the game play device may gauge his or her performance during game play. As per one example, once an IR receiver/transceiver (e.g., one of IR receivers/transceivers 202 of
Gesture movement 620 may, for example, be used by the user to effectuate a “shield” gesture that may be effective to shield the user from effective hits that may be scored by the spells of other game play participants. Shield gesture 620 may be initiated, for example, by transcribing an arc (e.g., arc 622) with a game play device originating from starting position 624 (e.g., to the left of the user) and terminating at ending position 626 (e.g., to the right of the user) substantially within a single plane (e.g., X-Y plane 628). Shield gesture 620 may then be completed by transcribing an arc (e.g., arc 630) with a game play device originating from starting position 626 (as a continuing movement from the end of arc 622) and terminating at ending position 624 (after which the game play device is held motionless for a few seconds) substantially within a single plane (e.g., X-Z plane 632) that may be substantially perpendicular to plane 628. During game play, feedback (e.g., audible, visible and/or haptic effects) may be used to differentiate between hits on shields versus hits on unshielded game play participants. Blocking with a shield may yield increased points for the defender with decreased points for the corresponding attacker. In alternate game play modes, the shield may simply prevent loss of life or life points.
As discussed above, gesture movements of a game play device (e.g., as directed by a user of game play device 100 of
Turning to
Movement(s) of the game play device may be construed within its coordinate frame as simple gestures that may consist of a single orientation followed by a single motion such as those discussed above, for example, in relation to gestures 302, 308, 314 and 320 of
Gesture detection data 752, 754 and 756 may be received periodically (e.g., at a 400 cycle per second polling rate via processor 206 of
Table 1 exemplifies mapping of data received from accelerometer 758 (e.g., Data Types 1-3 of Table 1) and gyroscope 760 (e.g., Data Types 4-6 of Table 1) into perceived movements of the game play device.
For example, a game play device (e.g., game play device 100 of
A game play device (e.g., game play device 100 of
A game play device (e.g., game play device 100 of
Angular velocity data may be provided by gyroscope 760 and may be utilized by gesture detection 766 to sense a rate of change in angular position of a game play device (e.g., game play device 100 of
Alternately, angular velocity data may be provided by gyroscope 760 and may be utilized by gesture detection 766 to sense a rate of change in angular position of a game play device (e.g., game play device 100 of
IMU 208 may be capable of expressing orientations of a game play device in the form of quaternions that may be generated by fusing accelerometer 758 and gyroscope 760 output data 752 and 754, respectively. Equations (1)-(3) represent expressions stated in terms of such quaternions, which when resolved by gesture detection 766 (e.g., as discussed above in relation to gesture detection 212 of
pitch=−sin⁻¹(2*(quati*quatk−quatr*quatj)) (1)
roll=tan⁻¹(2*(quatr*quati+quatj*quatk), quatr*quatr−quati*quati−quatj*quatj+quatk*quatk) (2)
yaw=tan⁻¹(2*(quati*quatj+quatr*quatk), quatr*quatr+quati*quati−quatj*quatj−quatk*quatk) (3)
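As one non-limiting illustration, equations (1)-(3) may be resolved in firmware substantially as sketched below; the C type and function names are hypothetical and merely illustrate one way the quaternion components reported by IMU 208 might be processed:

```c
#include <math.h>

/* Hypothetical representation of a fused quaternion reported by the IMU
 * (r is the scalar component; i, j and k are the vector components). */
typedef struct { float r, i, j, k; } quat_t;

/* Clamp to [-1, 1] to guard against floating-point drift before asinf(). */
static float clamp1(float x) { return x > 1.0f ? 1.0f : (x < -1.0f ? -1.0f : x); }

/* Resolve equations (1)-(3): Euler angles, in radians, from a unit quaternion. */
static void quat_to_euler(quat_t q, float *roll, float *pitch, float *yaw)
{
    *pitch = -asinf(clamp1(2.0f * (q.i * q.k - q.r * q.j)));          /* eq. (1) */
    *roll  = atan2f(2.0f * (q.r * q.i + q.j * q.k),
                    q.r * q.r - q.i * q.i - q.j * q.j + q.k * q.k);   /* eq. (2) */
    *yaw   = atan2f(2.0f * (q.i * q.j + q.r * q.k),
                    q.r * q.r + q.i * q.i - q.j * q.j - q.k * q.k);   /* eq. (3) */
}
```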
In order to mitigate the effects of erroneous or extreme variation in the data that may be received from accelerometer 758 and gyroscope 760, an exponential moving average (EMA) filter may be utilized, the code snippet of which may be exemplified by the for-loop of equation (5):
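A minimal sketch of such an EMA for-loop, under an assumed channel count and a hypothetical smoothing factor ALPHA chosen empirically, may take substantially the following form:

```c
#define NUM_CHANNELS 6          /* three accelerometer axes plus three gyroscope axes */
#define ALPHA        0.2f       /* hypothetical smoothing factor, 0 < ALPHA <= 1      */

static float ema[NUM_CHANNELS]; /* running averages, persisted between polls */

/* Apply an exponential moving average to one poll of raw IMU samples. */
static void ema_filter(const float raw[NUM_CHANNELS])
{
    for (int ch = 0; ch < NUM_CHANNELS; ch++) {
        /* New average = blend of the latest sample and the previous average. */
        ema[ch] = ALPHA * raw[ch] + (1.0f - ALPHA) * ema[ch];
    }
}
```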
The values computed by equations (1)-(3) above may similarly be subjected to a variation of the EMA filter of equation (5) whose operation and execution within gesture detection 766 may be exemplified by the pseudo-code of equation (6) as discussed in more detail below:
Equation (6) may be required when the value computed for pitch (e.g., as in equation (1) above) indicates that the game play device is oriented in a vertical or near-vertical position (i.e., verticality), thereby rendering the roll, pitch and/or yaw values computed by equations (1)-(3) unreliable.
The pseudo-code exemplified by equation (6) may be further useful when modulus arithmetic is necessary to describe the orientation of the game play device as it traverses modulus boundaries within the coordinate frame of
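One assumed sketch of such a variation of the EMA filter, holding the yaw estimate while verticality exists and wrapping angle differences across the modulus boundary, is shown below; the constant names and the verticality limit are assumptions for illustration only:

```c
#include <math.h>

#define PI_F           3.14159265f
#define ALPHA          0.2f                      /* hypothetical smoothing factor            */
#define VERTICAL_LIMIT (80.0f * PI_F / 180.0f)   /* assumed pitch beyond which yaw is ignored */

/* Wrap an angle difference into (-pi, +pi] so the filter never averages the
 * "long way" around a modulus boundary of the coordinate frame. */
static float wrap_pi(float a)
{
    while (a >  PI_F) a -= 2.0f * PI_F;
    while (a <= -PI_F) a += 2.0f * PI_F;
    return a;
}

/* Variation of the EMA filter applied to the computed roll/pitch/yaw values. */
static void ema_filter_angles(float roll, float pitch, float yaw,
                              float *f_roll, float *f_pitch, float *f_yaw)
{
    *f_roll  = wrap_pi(*f_roll  + ALPHA * wrap_pi(roll  - *f_roll));
    *f_pitch = wrap_pi(*f_pitch + ALPHA * wrap_pi(pitch - *f_pitch));

    /* Equation (3) yields unreliable yaw at or near verticality, so the
     * previously filtered yaw is held rather than updated in that case. */
    if (fabsf(*f_pitch) < VERTICAL_LIMIT)
        *f_yaw = wrap_pi(*f_yaw + ALPHA * wrap_pi(yaw - *f_yaw));
}
```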
As discussed above in relation to equation (2), roll values may be calculated and subsequently used by gesture detection 766 to determine a magnitude of roll exhibited by a game play device about axis 706, such that in one embodiment a roll value about axis 706 may be calculated to be within 0 to 360 degrees (0 to 2π radians) in direction 716 or conversely a roll value about axis 706 may be calculated to be within −0 to −360 degrees (−0 to −2π radians) in direction 714. As per one example, the game play device may be held by a user such that one end of the game play device (e.g., tip 106 of game play device 100 of
It can be seen, however, that as the roll value changes about axis 706 in either direction 714 or 716, vectors 702 and 704 must also rotate in equal proportion and direction as compared to the value of roll so as to maintain the orthogonality of axes 702, 704 and 706. In so doing, absolute motion as detected by gesture detection 766 may be made to appear fixed within the game play device's frame of reference by modifying the direction of vectors 702 and 704 in proportion to the roll value thereby substantially negating the roll value (e.g., a process described herein as normalization).
As per one example, a calculation of roll per equation (2) by gesture detection 766 may indicate that the game play device has increased its roll value from 0 to 90 degrees along vector 706 (e.g., pointing backward over the user's shoulder) in direction 714 (e.g., a roll value that as discussed in more detail below may exist within rightside roll quadrant 804 of
Accordingly, linear acceleration as measured by accelerometer 758 along vector 702 (+ACC702) may be normalized by gesture detection 766 to the acceleration value as measured along vector 704 (+ACC704) because the acceleration vector rotates from an axis parallel to vector 702 to an axis parallel to vector 704 due to the 90-degree change in roll value to rightside roll quadrant 804. Furthermore, linear acceleration as measured by accelerometer 758 along vector 704 may be normalized by gesture detection 766 to an inverted acceleration value as measured opposite vector 702 (−ACC702) because the acceleration vector rotates from an axis parallel to vector 704 to an axis parallel, but opposite to, vector 702. The corresponding angular velocity measurements about axes 702 and 704, +GYR702 and +GYR704, respectively, may similarly be normalized by gesture detection 766 to +GYR704 and −GYR702, respectively, due to the 90-degree change in roll value to rightside roll quadrant 804.
Turning to
Through the process of normalization, absolute motion of the game play device may be detected as if the game play device was in upside roll quadrant 802 regardless of the game play device's actual roll value. As per one example, the roll value of a game play device may have increased by 180 degrees from upside roll quadrant 802 in either direction 714 or 716 to downside roll quadrant 808 as determined by equation (2). As such, gesture detection 766 may normalize accelerometer data 752 and gyroscope data 754 to downside roll quadrant 808 by inverting accelerometer data 752 and gyroscope data 754 per Table 2 so that any movements of the game play device may be registered as if the game play device was oriented in upside quadrant 802 despite its actual orientation in downside quadrant 808.
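By way of an assumed sketch consistent with the 90-degree and 180-degree examples above (and not a reproduction of Table 2), the transverse accelerometer and gyroscope channels may be remapped per roll quadrant substantially as follows; the quadrant enumeration and the leftside-quadrant signs are assumptions for illustration only:

```c
/* Assumed enumeration of roll quadrants, with the upside quadrant 802 taken
 * as the reference frame to which all motion is normalized. */
typedef enum { QUAD_UPSIDE, QUAD_RIGHTSIDE, QUAD_DOWNSIDE, QUAD_LEFTSIDE } roll_quadrant_t;

/* Remap the transverse accelerometer channels (acc702, acc704) and the matching
 * gyroscope channels (gyr702, gyr704) so that movement registers as if the
 * device were in upside quadrant 802, regardless of its actual roll value. */
static void normalize_to_upside(roll_quadrant_t quad,
                                float *acc702, float *acc704,
                                float *gyr702, float *gyr704)
{
    float a702 = *acc702, a704 = *acc704;
    float g702 = *gyr702, g704 = *gyr704;

    switch (quad) {
    case QUAD_RIGHTSIDE:                 /* 90-degree roll into quadrant 804   */
        *acc702 =  a704;  *acc704 = -a702;
        *gyr702 =  g704;  *gyr704 = -g702;
        break;
    case QUAD_DOWNSIDE:                  /* 180-degree roll into quadrant 808  */
        *acc702 = -a702;  *acc704 = -a704;
        *gyr702 = -g702;  *gyr704 = -g704;
        break;
    case QUAD_LEFTSIDE:                  /* 90-degree roll the opposite way    */
        *acc702 = -a704;  *acc704 =  a702;
        *gyr702 = -g704;  *gyr704 =  g702;
        break;
    case QUAD_UPSIDE:                    /* already in the reference quadrant  */
    default:
        break;
    }
}
```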
Operation of the normalization process across roll quadrant boundaries may require modification of the minimum and maximum requirements for gyroscope data 754 as received from gyroscope 760 prior to movement detection. In particular, if the game play device exhibits a roll value that resides within any one of transition regions 810-817, then gyroscopic data 754 may tend to exhibit significantly reduced magnitudes (e.g., about 50% reduction) as compared to the magnitudes of gyroscopic data 754 generated when the game play device's roll value does not reside within any one of transition regions 810-817. Accordingly, proper manipulation of these reduced gyroscopic data magnitudes may be required to increase the accuracy of the corresponding gesture detection as discussed in more detail below.
Turning to
Table 3 lists exemplary step movements performed by a user while handling a game play device (e.g., game play device 100 of
Prior to the execution of decision step 902, the game play device's roll value may be evaluated to determine whether it resides within any one of transition regions 810-817 as discussed above in relation to
In one embodiment, for example, once accelerometer data 752 and gyroscope data 754 have been filtered in accordance with equation (5), minimum and maximum values for each may be tracked across a timespan (e.g., approximately 100 ms) and stored for processing. Accordingly, while a user is subjecting the game play device to movement, only the extreme values of the filtered accelerometer data 752 and gyroscope data 754 may be tracked during the movement. Any of the min/max values that meet magnitude thresholds for any particular movement (e.g., a simple or complex gesture) may then be registered. Once registered, the tracked min/max values may be reset to zero in preparation for detection of the next movement.
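As one assumed sketch of such min/max tracking, using a hypothetical set of six filtered channels and the approximately 100 ms window noted above:

```c
#include <stdbool.h>
#include <stdint.h>
#include <string.h>

#define NUM_CHANNELS    6       /* three filtered accelerometer + three gyroscope channels */
#define TRACK_WINDOW_MS 100u    /* assumed tracking timespan */

typedef struct {
    float    min_val[NUM_CHANNELS];
    float    max_val[NUM_CHANNELS];
    uint32_t window_start_ms;
} extrema_t;

/* Track only the extreme values of the filtered data while movement occurs. */
static void track_extrema(extrema_t *ex, const float filtered[NUM_CHANNELS])
{
    for (int ch = 0; ch < NUM_CHANNELS; ch++) {
        if (filtered[ch] < ex->min_val[ch]) ex->min_val[ch] = filtered[ch];
        if (filtered[ch] > ex->max_val[ch]) ex->max_val[ch] = filtered[ch];
    }
}

/* A movement is registered when any tracked extreme meets its magnitude threshold. */
static bool meets_threshold(const extrema_t *ex, const float thresh[NUM_CHANNELS])
{
    for (int ch = 0; ch < NUM_CHANNELS; ch++)
        if (ex->max_val[ch] >= thresh[ch] || ex->min_val[ch] <= -thresh[ch])
            return true;
    return false;
}

/* Once registered, the tracked min/max values are reset to zero for the next movement. */
static void reset_extrema(extrema_t *ex, uint32_t now_ms)
{
    memset(ex, 0, sizeof(*ex));
    ex->window_start_ms = now_ms;
}
```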
Min/max values may be tracked for longer periods of time if, for example, a particular movement cannot be characterized as either a simple or a complex gesture (e.g., the movement involves multiple directions but is not made up of multiple steps). In this instance, extreme values of filtered accelerometer data 752 and gyroscope data 754 may be tracked for a longer duration adequate to detect such a hybrid movement (e.g., overhead gesture 514 of
Further preprocessing may be required prior to execution of decision step 902 when the vertical facing steps of Table 4 are to be detected. As discussed above in relation to equation (3), for example, yaw values may be unreliable when the game play device is placed into a vertical, or near vertical, orientation. As such, changes in yaw are stored prior to entering a vertical orientation and subsequent to exiting a vertical orientation, and are then compared and used during decision step 902.
During execution of decision step 902, pitch and yaw values of equations (1) and (3), respectively, may be filtered and stored in respective pitch and yaw queues so as to determine a general trend in magnitude and direction while minimizing the effects of spurious data. Accordingly, a determination may be made as to whether magnitudes of pitch and yaw values are increasing, decreasing or substantially stable and whether directions of pitch and yaw values are changing. Furthermore, the rates of change of both pitch and yaw may be calculated from the data contained within the pitch and yaw queues to determine which quantity is changing more with respect to time.
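One assumed sketch of such pitch and yaw queues, using a hypothetical fixed queue depth and taking the trend simply as the difference between the newest and oldest filtered values, is shown below:

```c
#include <math.h>
#include <stdbool.h>

#define QUEUE_LEN 8   /* assumed depth of recent filtered pitch or yaw samples */

typedef struct {
    float samples[QUEUE_LEN];
    int   head;        /* index of the oldest sample (next slot to overwrite) */
} angle_queue_t;

/* Push the newest filtered value, discarding the oldest. */
static void queue_push(angle_queue_t *q, float value)
{
    q->samples[q->head] = value;
    q->head = (q->head + 1) % QUEUE_LEN;
}

/* General trend across the queue: positive when increasing, negative when
 * decreasing, and near zero when the value is substantially stable. */
static float queue_trend(const angle_queue_t *q)
{
    int newest = (q->head + QUEUE_LEN - 1) % QUEUE_LEN;
    return q->samples[newest] - q->samples[q->head];   /* newest minus oldest */
}

/* Determine which of pitch or yaw is changing more with respect to time. */
static bool yaw_changing_faster(const angle_queue_t *pitch_q, const angle_queue_t *yaw_q)
{
    return fabsf(queue_trend(yaw_q)) > fabsf(queue_trend(pitch_q));
}
```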
The roll, pitch and yaw magnitude/directional processing as discussed above may be useful to reduce certain spurious effects that may be experienced during gesture detection. For example, EMA filtering may be used to remove spurious directional data (e.g., direction of movement reported opposite to the actual direction of movement) that may result from deceleration of movement of a game play device. Yaw and pitch rate processing may be useful, for example, when confirming that purely left/right movements include greater yaw rates than pitch rates. Absolute values of pitch may be used, for example, to accurately determine the verticality of the game play device thereby producing knowledge that the yaw calculations of equation (3) may yield anomalous results should verticality exist.
As per an example for clarification of the discussion above, the code snippet of Table 5 exemplifies step processing checks that may be used by gesture detection 766 during decision 902 of
Once Step Up Left decision 902 passes the checks of Table 5, the Step Up Left step may be added to the step queue as in process 904, but only if the step has not been previously detected within an amount of time (e.g., 100 ms). As discussed above, for example, a 100 ms step timer may be set each time a step has been detected such that any duplicate steps detected before the step timer has expired may not be further added to the step queue to avoid redundant detection of steps.
If the detected step does not form an intermediate step of a simple gesture (e.g., the Step Up Left movement will not be found to form any portion of simple gestures 302, 308, 314 and 320 of
Once stabilized, decision 912 parses through the step queue to determine whether any of the detected steps can be linked to form a complex gesture. If so, then the complex gesture may be registered as in process 914 and any previously detected simple gestures (e.g., as detected by decision 906) may then be cleared from the simple gesture queue (e.g., as added to by process 908). If no complex gestures may be registered, then decision 916 may determine whether any simple gestures have been detected and stored. If so, then the simple gesture may be registered as in process 918. The step queue may be emptied as in process 920, which then returns gesture detection processing to step detection as in decision 902.
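An assumed sketch of the step queue bookkeeping implied by processes 904 and 920, using a hypothetical step encoding and the 100 ms step timer noted above, may resemble the following:

```c
#include <stdbool.h>
#include <stdint.h>

#define STEP_QUEUE_LEN   16
#define STEP_DEBOUNCE_MS 100u   /* assumed step timer: duplicates inside this window are dropped */

/* Assumed encoding of the detectable step movements. */
typedef enum { STEP_NONE, STEP_UP, STEP_DOWN, STEP_LEFT, STEP_RIGHT,
               STEP_UP_LEFT, STEP_UP_RIGHT, STEP_DOWN_LEFT, STEP_DOWN_RIGHT } step_t;

typedef struct {
    step_t   steps[STEP_QUEUE_LEN];
    int      count;
    step_t   last_step;
    uint32_t last_step_ms;
} step_queue_t;

/* Process 904: append a detected step unless it duplicates the previous step
 * within the debounce window, avoiding redundant detection of steps. */
static bool add_step(step_queue_t *q, step_t step, uint32_t now_ms)
{
    if (step == q->last_step && (now_ms - q->last_step_ms) < STEP_DEBOUNCE_MS)
        return false;                          /* duplicate, not re-queued */
    if (q->count < STEP_QUEUE_LEN)
        q->steps[q->count++] = step;
    q->last_step    = step;
    q->last_step_ms = now_ms;
    return true;
}

/* Process 920: empty the step queue once a simple or complex gesture has been
 * registered, returning gesture detection to step detection. */
static void clear_steps(step_queue_t *q)
{
    q->count     = 0;
    q->last_step = STEP_NONE;
}
```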
Turning to
Each series of preselected sequences 1001-1009 and 1051-1058 may, for example, be stored within a memory of a game play device (e.g., memory 232 as discussed above in relation to
In one embodiment, such proximity differences may be characterized as weighted Levenshtein Distances and stored into a table that may itemize the degree of error that may exist between each user input step and each corresponding template step. As per an example, the weighted Levenshtein Distance may indicate a high degree of error between a user implemented “Step Right” component when compared to a corresponding “Step Left” template component. Conversely, the weighted Levenshtein Distance may indicate a low degree of error between a user implemented “Step Right” component when compared to a corresponding “Step Up Right” template component. Further, weighted Levenshtein Distances may also be recorded for missing and/or additional user input steps.
Once the step queue of detected user input steps has been compared and scored against all templates, the template comparison resulting in the best score (e.g., the lowest Levenshtein Distance score) may be selected as the template that most likely represents the complex gesture intended by the user. The best score may then be compared to a threshold score to determine whether the complex gesture is to be registered (e.g., as discussed above in relation to process 914 of
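An assumed sketch of such a weighted Levenshtein comparison is shown below; the integer step encoding, the gap cost and the simplified substitution weighting are illustrative assumptions, and a fuller implementation would tabulate direction-aware substitution costs for each pair of step types:

```c
#define MAX_STEPS 16    /* assumed upper bound on steps per gesture (n, m <= MAX_STEPS) */
#define GAP_COST  1.0f  /* cost charged for a missing or additional user input step */

/* Hypothetical substitution cost between two step codes: identical steps cost
 * nothing; a full table would charge less for adjacent directions (e.g., Step
 * Right vs. Step Up Right) and more for opposite directions. */
static float subst_cost(int user_step, int template_step)
{
    return (user_step == template_step) ? 0.0f : 1.0f;
}

/* Weighted Levenshtein Distance between the queue of detected user input steps
 * and one complex-gesture template; the lowest score over all templates may
 * then be compared against a registration threshold. */
static float weighted_levenshtein(const int *user, int n, const int *tmpl, int m)
{
    float d[MAX_STEPS + 1][MAX_STEPS + 1];

    for (int i = 0; i <= n; i++) d[i][0] = (float)i * GAP_COST;
    for (int j = 0; j <= m; j++) d[0][j] = (float)j * GAP_COST;

    for (int i = 1; i <= n; i++) {
        for (int j = 1; j <= m; j++) {
            float del  = d[i - 1][j] + GAP_COST;      /* additional user step */
            float ins  = d[i][j - 1] + GAP_COST;      /* missing user step    */
            float sub  = d[i - 1][j - 1] + subst_cost(user[i - 1], tmpl[j - 1]);
            float best = (del < ins) ? del : ins;
            d[i][j] = (sub < best) ? sub : best;
        }
    }
    return d[n][m];
}
```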
It should be noted that decision 912 of
In one embodiment, the movements constituting simple and/or complex gestures may be reversed or mirrored so as to increase the efficacy for specialty (e.g., left-handed) users who may tend to gesture in directions that may be opposite to those of right-handed users. Further, the simple and/or complex gestures may be detected using less rigid detection rules for certain gestures. As per one example, gesture 504 (as discussed below in relation to
In alternate embodiments, the game play device may include a machine-teaching mode that may utilize both simple and complex gesture templates stored within the game play device's memory to teach a user of the game play device how to optimize their movements when attempting to issue gesture commands. For example, the game play device may enter the machine-teaching mode (e.g., via a voice command issued by the user) such that a voice to text converter (e.g., voice transducer 218 as discussed above in relation to
The user may then verbally communicate his/her specific learning requirement (e.g., “teach me how to cast a fireball spell”), which then may cause the game play device to recall the template of step components associated with a “fireball” gesture. The game play device may then direct (e.g., via audible, visible or tactile feedback) the user to emulate each step component of the requested gesture by progressively maneuvering the game play device through each step. As the steps progress, the game play device may provide constructive feedback (e.g., audibly, visibly or via haptics) to allow the user to continuously improve his or her skill at effectuating the requested gesture. Such a machine-teaching mode may be made to be active at any time including during game play to correct and/or instruct gesture technique. As such, the machine teaching may act as an artificially intelligent (AI) assistant during the game play to improve the user's performance.
Turning to
Battery 1130 (e.g., as discussed above in relation to battery 220 of
Turning to
Turning to
Turning to
UV light transmission may, for example, be accomplished via UV emitter 1404 and light pipe 1402 such that UV light emitted by 1404 in a direction that is normal to side 1422 of PCBA 1124 may be emitted from the game play device in a direction that is parallel to side 1422, which in one embodiment effectuates UV light transmission from a tip of the game play device (e.g., tip 106 as discussed above in relation to
Other aspects and embodiments of the present invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. For example, the game play device may be implemented with virtually any form factor (e.g., relay baton) so as to facilitate portability. It is intended, therefore, that the specification and illustrated embodiments be considered as examples only, with a true scope and spirit of the invention being indicated by the following claims.
Number | Date | Country
---|---|---
63350837 | Jun 2022 | US