The present disclosure relates generally to video game systems and, more specifically, to an interactive video game system that enables simultaneous multi-player game play.
Video game systems generally enable players to control characters in a virtual environment to achieve predefined goals or objectives. Traditional video game systems generally rely on manual input devices, such as joysticks, game controllers, keyboards, and so forth, to enable players to control characters within the virtual environment of the game. Additionally, certain modern video game systems can include a camera capable of tracking the movements of players, enabling players to control video game characters based on their movements. However, these systems typically suffer from issues with occlusion, in which a portion of a player is at least temporarily obscured from the camera and, as a result, the system is no longer able to accurately track the position or movements of the player. For example, occlusion can cause jittering or stuttering in the movements of the characters in the virtual environment, as well as other imprecise or erroneous translation of player actions into character actions within the game. Additionally, for multi-player video game systems, the potential for occlusion dramatically increases with the number of players.
Present embodiments are directed to an interactive video game system that includes an array of volumetric sensors disposed around a play area that is configured to collect respective volumetric data for each of a plurality of players. The system includes a controller communicatively coupled to the array of volumetric sensors. The controller is configured to receive, from the array of volumetric sensors, respective volumetric data of each of the plurality of players. The controller is configured to combine the respective volumetric data of each of the plurality of players to generate at least one respective model for each of the plurality of players. The controller is also configured to generate a respective virtual representation for each player of the plurality of players based, at least in part, on the generated at least one respective model of each player of the plurality of players. The controller is further configured to present the generated respective virtual representation of each player of the plurality of players in a virtual environment.
Present embodiments are also directed to an interactive video game system having a display device disposed near a play area and configured to display a virtual environment to a plurality of players in the play area. The system includes an array of sensing units disposed around the play area, wherein each sensing unit of the array is configured to determine a partial model of at least one player of the plurality of players. The system also includes a controller communicatively coupled to the array of sensing units. The controller is configured to: receive, from the array of sensing units, the partial models of each player of the plurality of players; generate a respective model of each player of the plurality of players by fusing the partial models of each player of the plurality of players; determine in-game actions of each player of the plurality of players based, at least in part, on the generated respective model of each player of the plurality of players; and display a respective virtual representation of each player of the plurality of players in the virtual environment on the display device based, at least in part, on the generated respective model and the in-game actions of each player of the plurality of players.
Present embodiments are also directed to a method of operating an interactive video game system. The method includes receiving, via processing circuitry of a controller of the interactive video game system, partial models of a plurality of players positioned within a play area from an array of sensing units disposed around the play area. The method includes fusing, via the processing circuitry, the received partial models of each player of the plurality of players to generate a respective model of each player of the plurality of players. The method includes determining, via the processing circuitry, in-game actions of each player of the plurality of players based, at least in part, on the generated respective models of each player of the plurality of players. The method also includes presenting, via a display device, a virtual representation of each player of the plurality of players in a virtual environment based, at least in part, on the generated respective model and the in-game actions of each player of the plurality of players.
These and other features, aspects, and advantages of the present disclosure will become better understood when the following detailed description is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:
As used herein, “volumetric scanning data” refers to three-dimensional (3D) data, such as point cloud data, collected by optically measuring (e.g., imaging, ranging) visible outer surfaces of players in a play area. As used herein, a “volumetric model” is a 3D model generated from the volumetric scanning data of a player that generally describes the outer surfaces of the player and may include texture data. A “shadow model,” as used herein, refers to a texture-less volumetric model of a player generated from the volumetric scanning data. As such, when presented on a two-dimensional (2D) surface, such as a display device, the shadow model of a player has a shape substantially similar to a shadow or silhouette of the player when illuminated from behind. A “skeletal model,” as used herein, refers to a 3D model generated from the volumetric scanning data of a player that defines predicted locations and positions of certain bones (e.g., bones associated with the arms, legs, head, spine) of the player to describe the location and pose of the player within a play area.
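Purely as an illustrative aid, the following sketch shows one way these model types might be represented as data structures; the class and field names are assumptions of this example and are not structures defined by the present disclosure:

```python
from dataclasses import dataclass
from typing import Dict, List, Optional, Tuple

Point3D = Tuple[float, float, float]

@dataclass
class VolumetricScan:
    """Raw volumetric scanning data: optically measured samples of a player's
    visible outer surfaces (e.g., a point cloud)."""
    points: List[Point3D]
    colors: Optional[List[Tuple[int, int, int]]] = None  # texture data, if captured

@dataclass
class ShadowModel:
    """Texture-less volumetric model; presented on a 2D display, it resembles
    the player's silhouette."""
    surface_points: List[Point3D]

@dataclass
class SkeletalModel:
    """Predicted bone positions (e.g., arms, legs, head, spine) describing the
    player's location and pose within the play area."""
    bones: Dict[str, Point3D]  # bone name -> predicted 3D position
```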
Present embodiments are directed to an interactive video game system that enables multiple players (e.g., up to 12) to perform actions in a physical play area to control virtual representations of the players in a displayed virtual environment. The disclosed interactive video game system includes an array having two or more volumetric sensors, such as depth cameras and Light Detection and Ranging (LIDAR) devices, capable of volumetrically scanning each of the players. The system includes suitable processing circuitry that generates models (e.g., volumetric models, shadow models, skeletal models) for each player based on the volumetric scanning data collected by the array of sensors, as discussed below. During game play, at least two volumetric sensors capture the actions of the players in the play area, and the system determines the nature of these actions based on the generated player models. Accordingly, the interactive video game system continuously updates the virtual representations of the players and the virtual environment based on the actions of the players and their corresponding in-game effects.
As mentioned, the array of the disclosed interactive video game system includes multiple volumetric sensors arranged around the play area to monitor the actions of the players within the play area. This generally ensures that a skeletal model of each player can be accurately generated and updated throughout game play despite potential occlusion from the perspective of one or more volumetric sensors of the array. Additionally, the processing circuitry of the system may use the volumetric scanning data to generate aspects (e.g., size, shape, outline) of the virtual representations of each player within the virtual environment. In certain embodiments, certain aspects (e.g., color, texture, scale) of the virtual representation of each player may be further adjusted or modified based on information associated with the player. As discussed below, this information may include information related to game play (e.g., items acquired, achievements unlocked), as well as other information regarding activities of the player outside of the game (e.g., player performance in other games, items purchased by the player, locations visited by the player). Furthermore, the volumetric scanning data collected by the array of volumetric sensors can be used by the processing circuitry of the game system to generate additional content, such as souvenir images in which a volumetric model of the player is illustrated as being within the virtual world. Accordingly, the disclosed interactive video game system enables an immersive and engaging experience for multiple simultaneous players.
With the foregoing in mind,
The embodiment of the interactive video game system 10 illustrated in
In the illustrated embodiment, each sensing unit 38 includes a respective volumetric sensor 40, which may be an infra-red (IR) depth camera, a LIDAR device, or another suitable ranging and/or imaging device. For example, in certain embodiments, all of the volumetric sensors 40 of the sensing units 38 in the array 36 are either IR depth cameras or LIDAR devices, while in other embodiments, a mixture of both IR depth cameras and LIDAR devices is present within the array 36. It is presently recognized that both IR depth cameras and LIDAR devices can be used to volumetrically scan each of the players 12, and the collected volumetric scanning data can be used to generate various models of the players, as discussed below. For example, in certain embodiments, IR depth cameras in the array 36 may be used to collect data to generate skeletal models, while the data collected by LIDAR devices in the array 36 may be used to generate volumetric and/or shadow models of the players 12, as discussed in greater detail below. It is also recognized that LIDAR devices, which collect point cloud data, are generally capable of scanning and mapping a larger area than depth cameras, typically with better accuracy and resolution. As such, in certain embodiments, at least one sensing unit 38 of the array 36 includes a corresponding volumetric sensor 40 that is a LIDAR device to enhance the accuracy or resolution of the array 36 and/or to reduce the total number of sensing units 38 present in the array 36.
Further, each illustrated sensing unit 38 includes a sensor controller 42 having suitable memory circuitry 44 and processing circuitry 46. The processing circuitry 46 of each sensing unit 38 executes instructions stored in the memory circuitry 44 to enable the sensing unit 38 to volumetrically scan the players 12 and generate volumetric scanning data for each of the players 12. In the illustrated embodiment, the sensing units 38 are communicatively coupled to the primary controller 34 via a high-speed internet protocol (IP) network 48 that enables low-latency exchange of data between the devices of the interactive video game system 10. Additionally, in certain embodiments, the sensing units 38 may each include a respective housing that packages the sensor controller 42 together with the volumetric sensor 40.
It may be noted that, in other embodiments, the sensing units 38 may not include a respective sensor controller 42. For such embodiments, the processing circuitry 35 of the primary controller 34, or other suitable processing circuitry of the system 10, is communicatively coupled to the respective volumetric sensors 40 of the array 36 to provide control signals directly to, and to receive data signals directly from, the volumetric sensors 40. However, it is presently recognized that processing (e.g., filtering, skeletal mapping) the volumetric scanning data collected by each of these volumetric sensors 40 can be processor-intensive. As such, in certain embodiments, it can be advantageous to divide the workload by utilizing dedicated processors (e.g., the processors 46 of each of the sensor controllers 42) to process the volumetric data collected by the respective sensor 40, and then to send the processed data to the primary controller 34. For example, in the illustrated embodiment, each of the processors 46 of the sensor controllers 42 processes the volumetric scanning data collected by its respective sensor 40 to generate partial models (e.g., partial volumetric models, partial skeletal models, partial shadow models) of each of the players 12, and the processing circuitry 35 of the primary controller 34 receives and fuses or combines the partial models to generate complete models of each of the players 12, as discussed below.
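As a non-limiting sketch of this division of labor, a sensing unit's dedicated processor might run a loop of the following form, reducing each dense scan to a compact partial model before transmitting it over the network; the host name, port, message format, and both callables are assumptions of this example:

```python
import json
import socket
import time

def run_sensing_unit(capture_scan, reduce_to_partial_model, unit_id,
                     controller_host="primary-controller.local",
                     controller_port=9000):
    """Hypothetical per-frame loop for one sensing unit: perform the
    processor-intensive reduction locally, then send only the small,
    processed result to the primary controller over the IP network."""
    while True:
        raw_scan = capture_scan()                    # dense volumetric data (large)
        partial = reduce_to_partial_model(raw_scan)  # e.g., partial skeletal model (small)
        payload = json.dumps({"unit": unit_id,
                              "timestamp": time.time(),
                              "partial_model": partial}).encode()
        with socket.create_connection((controller_host, controller_port)) as conn:
            conn.sendall(payload)
```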
Additionally, in certain embodiments, the primary controller 34 may also receive information from other sensing devices in and around the play area 16. For example, the illustrated primary controller 34 is communicatively coupled to a radio-frequency (RF) sensor 45 disposed near (e.g., above, below, adjacent to) the 3D play area 16A. The illustrated RF sensor 45 receives a uniquely identifying RF signal from a wearable device 47, such as a bracelet or headband having a radio-frequency identification (RFID) tag worn by each of the players 12. In response, the RF sensor 45 provides signals to the primary controller 34 regarding the identity and the relative positions of the players 12 in the play area 16. As such, for the illustrated embodiment, processing circuitry 35 of the primary controller 34 receives and combines the data collected by the array 36, and potentially other sensors (e.g., RF sensor 45), to determine the identities, locations, and actions of the players 12 in the play area 16 during game play. Additionally, the illustrated primary controller 34 is communicatively coupled to a database system 50, or any other suitable data repository storing player information. The database system 50 includes processing circuitry 52 that executes instructions stored in memory circuitry 54 to store and retrieve information associated with the players 12, such as player models (e.g., volumetric, shadow, skeletal), player statistics (e.g., wins, losses, points, total game play time), player attributes or inventory (e.g., abilities, textures, items), player purchases at a gift shop, player points in a loyalty rewards program, and so forth. The processing circuitry 35 of the primary controller 34 may query, retrieve, and update information stored by the database system 50 related to the players 12 to enable the system 10 to operate as set forth herein.
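One conceivable way to combine the RF sensor's identity reports with the array's position data is a nearest-neighbor association, sketched below; the bone key, report format, and function names are illustrative assumptions rather than elements of the disclosure:

```python
import math

def associate_identities(rf_reports, tracked_models):
    """Match each wearable-reported player identity to the nearest tracked
    model. `rf_reports` maps player_id -> approximate (x, y, z) position from
    the RF sensor; `tracked_models` maps a track index -> fused skeletal
    model (bone name -> position)."""
    assignments = {}
    for player_id, rf_pos in rf_reports.items():
        best_track, best_dist = None, float("inf")
        for track, model in tracked_models.items():
            spine = model.get("spine")  # representative bone for player position
            if spine is None:
                continue
            dist = math.dist(rf_pos, spine)
            if dist < best_dist:
                best_track, best_dist = track, dist
        if best_track is not None:
            assignments[player_id] = best_track
    return assignments
```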
Additionally, the embodiment of the interactive video game system 10 illustrated in
Additionally, the embodiment of the interactive video game system 10 illustrated in
As illustrated in
As mentioned, the array 36 illustrated in
As mentioned, the volumetric scanning data collected by the array 36 of the interactive video game system 10 can be used to generate various models (e.g., volumetric, shadow, skeletal) for each player. For example,
Additionally, embodiments of the interactive video game system 10 having the 3D play area 16A, as illustrated in
It may be noted that, for embodiments of the interactive game system 10 having the 3D play area 16A, as represented in
For comparison,
It is presently recognized that embodiments of the interactive video game system 10 that utilize a 2D play area 16B, as represented in
Accordingly, it is recognized that the smaller array 36 of sensing units 38 used by embodiments of the interactive video game system 10 having the 2D play area 16B also generates considerably less data to be processed than embodiments having the 3D play area 16A. For example, because occlusion between players 12 is significantly more limited and predictable in the 2D play area 16B of
As mentioned, the interactive video game system 10 is capable of generating various models of the players 12. More specifically, in certain embodiments, the processing circuitry 35 of the primary controller 34 is configured to receive partial model data (e.g., partial volumetric, shadow, and/or skeletal models) from the various sensing units 38 of the array 36 and fuse the partial models into complete models (e.g., complete volumetric, shadow, and/or skeletal models) for each of the players 12. Set forth below is an example in which the processing circuitry 35 of the primary controller 34 fuses partial skeletal models received from the various sensing units 38 of the array 36. It may be appreciated that, in certain embodiments, the processing circuitry 35 of the primary controller 34 may use a similar process to fuse partial shadow model data into a shadow model and/or to fuse partial volumetric model data.
In an example, partial skeletal models are generated by each sensing unit 38 of the interactive video game system 10 and are subsequently fused by the processing circuitry 35 of the primary controller 34. In particular, the processing circuitry 35 may perform a one-to-one mapping of corresponding bones of each of the players 12 in each of the partial skeletal models generated by different sensing units 38 positioned at different angles (e.g., opposite sides, perpendicular) relative to the play area 16. In certain embodiments, relatively small differences between the partial skeletal models generated by different sensing units 38 may be averaged when fused by the processing circuitry 35 to provide smoothing and prevent jerky movements of the virtual representations 14. Additionally, when a partial skeletal model generated by a particular sensing unit differs significantly from the partial skeletal models generated by at least two other sensing units, the processing circuitry 35 of the primary controller 34 may determine the data to be erroneous and, therefore, not include the data in the skeletal models 80. For example, if a particular partial skeletal model is missing a bone that is present in the other partial skeletal models, then the processing circuitry 35 may determine that the missing bone is likely the result of occlusion, and may discard all or some of the partial skeletal model in response.
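A minimal sketch of such a fusion step, assuming each partial skeletal model is a mapping from bone names to 3D positions, might look as follows; the tolerance value, bone names, and agreement rule are illustrative choices of this example, not parameters given by the disclosure:

```python
import statistics

def fuse_partial_skeletons(partials, tolerance=0.25):
    """Fuse per-sensing-unit partial skeletal models for a single player.

    `partials`: one dict per sensing unit mapping bone name -> (x, y, z).
    Small differences between units are averaged for smoothing, while a
    reading that strays from the per-axis median of the readings by more
    than `tolerance` is treated as erroneous (e.g., caused by occlusion)
    and excluded."""
    all_bones = set().union(*(p.keys() for p in partials)) if partials else set()
    fused = {}
    for bone in all_bones:
        # A unit missing this bone likely had it occluded; skip that unit.
        readings = [p[bone] for p in partials if bone in p]
        consensus = tuple(statistics.median(axis) for axis in zip(*readings))
        kept = [r for r in readings
                if max(abs(a - b) for a, b in zip(r, consensus)) <= tolerance]
        # Trust a bone only when at least two units agree (or only one saw it).
        if len(kept) >= 2 or len(readings) == 1:
            fused[bone] = tuple(statistics.fmean(axis)
                                for axis in zip(*(kept or readings)))
    return fused

# Example: two units agree; a third unit's divergent spine reading is excluded.
unit_a = {"left_forearm": (0.50, 1.20, 2.00), "spine": (0.00, 1.00, 2.00)}
unit_b = {"left_forearm": (0.52, 1.18, 2.01), "spine": (0.01, 1.02, 1.99)}
unit_c = {"spine": (0.90, 1.60, 2.40)}  # strays from the others: discarded
print(fuse_partial_skeletons([unit_a, unit_b, unit_c]))
```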
It may be noted that precise coordination of the components of the interactive game system 10 is desirable to provide smooth and responsive movements of the virtual representations 14 in the virtual environment 32. In particular, to properly fuse the partial models (e.g., partial skeletal, volumetric, and/or shadow models) generated by the sensing units 38, the processing circuitry 35 may consider the time at which each of the partial models is generated by the sensing units 38. In certain embodiments, the interactive video game system 10 may include a system clock 100, as illustrated in
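For illustration, timestamp-based coordination could be sketched as a small buffer on the primary controller that releases a set of partial models for fusion only once every sensing unit has reported within a common capture window; the 10 ms window and the expected unit count below are assumptions of this example:

```python
class FrameAssembler:
    """Sketch of timestamp-based coordination: group partial models by their
    (clock-synchronized) capture timestamps so that only models generated
    together are fused together."""

    def __init__(self, window_s=0.010, expected_units=4):
        self.window_s = window_s
        self.expected_units = expected_units
        self.pending = []  # (timestamp, unit_id, partial_model)

    def add(self, timestamp, unit_id, partial_model):
        self.pending.append((timestamp, unit_id, partial_model))

    def pop_frame(self):
        """Return one frame's worth of partial models, or None if incomplete."""
        if not self.pending:
            return None
        self.pending.sort(key=lambda item: item[0])
        t0 = self.pending[0][0]
        in_window = [p for p in self.pending if p[0] - t0 <= self.window_s]
        if len(in_window) < self.expected_units:
            return None  # wait for the remaining sensing units to report
        self.pending = self.pending[len(in_window):]
        return [partial for _, _, partial in in_window]
```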
The illustrated embodiment of the process 110 begins with the interactive game system 10 collecting (block 112) volumetric scanning data for each player. In certain embodiments, as illustrated in
Next, the interactive video game system 10 generates (block 114) corresponding models for each player based on the volumetric scanning data collected for each player. As set forth above, in certain embodiments, the processing circuitry 35 of the primary controller 34 may receive partial models for each of the players from each of the sensing units 38 in the array 36, and may fuse these partial models to generate complete models for each of the players. For example, the processing circuitry 35 of the primary controller 34 may generate a volumetric model for each player that generally defines a 3D shape of each player. Additionally or alternatively, the processing circuitry 35 of the primary controller 34 may generate a shadow model for each player that generally defines a texture-less 3D shape of each player. Furthermore, the processing circuitry 35 may also generate a skeletal model that generally defines the predicted skeletal positions and locations of each player within the play area.
Continuing through the example process 110, the interactive video game system 10 next generates (block 116) a corresponding virtual representation for each player based, at least in part, on the volumetric scanning data collected for each player and/or one or more of the models generated for each player. For example, in certain embodiments, the processing circuitry 35 of the primary controller 34 may use a shadow model generated in block 114 as a basis to generate a virtual representation of a player. It may be appreciated that, in certain embodiments, the virtual representations 14 may have a shape or outline that is substantially similar to the shadow model of the corresponding player, as illustrated in
It may be noted that, in certain embodiments, the virtual representations 14 of the players 12 may not have an appearance or shape that substantially resembles the generated volumetric or shadow models. For example, in certain embodiments, the interactive video game system 10 may include or be communicatively coupled to a pre-generated library of virtual representations that are based on fictitious characters (e.g., avatars), and the system may select particular virtual representations, or provide recommendations of particular selectable virtual representations, for a player generally based on the generated volumetric or shadow model of the player. For example, if the game involves a larger hero and a smaller sidekick, the interactive video game system 10 may select or recommend from the pre-generated library a relatively larger hero virtual representation for an adult player and a relatively smaller sidekick virtual representation for a child player.
The process 110 continues with the interactive video game system 10 presenting (block 118) the corresponding virtual representations 14 of each of the players in the virtual environment 32 on the display device 24. In certain embodiments, the actions in block 118 may also include other introductory presentations, such as a welcome message or orientation/instructional information, provided to the players 12 in the play area 16 before game play begins. Furthermore, in certain embodiments, the processing circuitry 35 of the primary controller 34 may also provide suitable signals to set or modify parameters of the environment within the play area 16. For example, these modifications may include adjusting house light brightness and/or color, playing game music or game sound effects, adjusting the temperature of the play area, activating physical effects in the play area, and so forth.
Once game play begins, the virtual representations 14 generated in block 116 and presented in block 118 are capable of interacting with one another and/or with virtual objects (e.g., virtual objects 92 and 94) in the virtual environment 32, as discussed herein with respect to
The process 130 of
For the illustrated embodiment of the process 130, after receiving the partial models from the sensing units 38, the processing circuitry 35 fuses the partial models to generate (block 134) updated models (e.g., volumetric, shadow, and/or skeletal) for each player based on the received partial models. For example, the processing circuitry 35 may update a previously generated model, such as an initial skeletal model generated in block 114 of the process 110 of
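One simple form such an update might take is exponential smoothing between the previous and newly fused skeletal models, sketched below under the same bone-name assumptions as above; the blending factor is an illustrative tuning value:

```python
def smooth_model(previous, current, alpha=0.6):
    """Blend the newly fused skeletal model with the prior frame's model
    (exponential smoothing). `alpha` weights the newest reading. Bones absent
    from the current frame keep their last known pose, masking brief
    occlusions rather than letting limbs jitter or vanish."""
    smoothed = dict(previous)  # start from last known poses
    for bone, pos in current.items():
        if bone in previous:
            smoothed[bone] = tuple(alpha * c + (1 - alpha) * p
                                   for c, p in zip(pos, previous[bone]))
        else:
            smoothed[bone] = pos
    return smoothed
```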
Next, the illustrated process 130 continues with the processing circuitry 35 identifying (block 136) one or more in-game actions of the corresponding virtual representations 14 of each player 12 based, at least in part, on the updated models of the players generated in block 134. For example, the in-game actions may include jumping, running, sliding, or otherwise moving of the virtual representations 14 within the virtual environment 32. In-game actions may also include interacting with (e.g., moving, obtaining, losing, consuming) an item, such as a virtual object in the virtual environment 32. In-game actions may also include completing a goal, defeating another player, winning a round, or other similar in-game actions.
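As a rough sketch, in-game actions such as jumping or running might be identified by comparing successive skeletal models, as in the following example; the bone name, axis convention (y-axis vertical), and velocity thresholds are assumptions of this illustration:

```python
import math

def identify_actions(prev_model, curr_model, dt, jump_speed=1.0, run_speed=1.5):
    """Classify simple in-game actions from two successive skeletal models
    sampled `dt` seconds apart (positions in meters)."""
    actions = []
    if "spine" in prev_model and "spine" in curr_model:
        prev, curr = prev_model["spine"], curr_model["spine"]
        vertical_velocity = (curr[1] - prev[1]) / dt  # y-axis treated as vertical
        if vertical_velocity > jump_speed:
            actions.append("jump")
        horizontal_speed = math.hypot(curr[0] - prev[0], curr[2] - prev[2]) / dt
        if horizontal_speed > run_speed:
            actions.append("run")
    return actions
```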
Next, the processing circuitry 35 may determine (block 138) one or more in-game effects triggered in response to the identified in-game actions of each of the players 12. For example, when the determined in-game action is a movement of a player, the in-game effect may be a corresponding change in position of the corresponding virtual representation within the virtual environment. When the determined in-game action is a jump, the in-game effect may include moving the virtual representation along the y-axis 20, as illustrated in
The illustrated process 130 continues with the processing circuitry 35 generally updating the presentation to the players in the play area 16 based on the in-game actions of each player and the corresponding in-game effects, as indicated by bracket 122. In particular, the processing circuitry 35 updates (block 140) the corresponding virtual representations 14 of each of the players 12 and the virtual environment 32 based on the updated models (e.g., shadow and skeletal models) of each player generated in block 134, the in-game actions identified in block 136, and/or the in-game effects determined in block 138, to advance game play. For example, for the embodiments illustrated in
Additionally, the processing circuitry 35 may provide suitable signals to generate (block 142) one or more sounds and/or one or more physical effects (block 144) in the play area 16 based, at least in part, on the determined in-game effects. For example, when the in-game effect is determined to be a particular virtual representation of a player crashing into a virtual pool, the primary controller 34 may cause the output controller 56 to signal the speakers 62 to generate suitable splashing sounds and/or the physical effects devices 78 to generate a blast of mist. Additionally, sounds and/or physical effects may be produced in response to any number of in-game effects, including, for example, gaining a power-up, losing a power-up, scoring a point, or moving through particular types of environments. As mentioned with respect to
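By way of example only, the routing from determined in-game effects to sound and physical-effect outputs could be expressed as a simple lookup and dispatch; the effect names, sound files, and controller methods below are all hypothetical stand-ins, not elements defined by the disclosure:

```python
class OutputController:
    """Stand-in for the output controller 56; real device I/O is out of scope."""
    def play_sound(self, clip):
        print(f"speakers 62: playing {clip}")
    def trigger_physical_effect(self, effect):
        print(f"physical effect devices 78: triggering {effect}")

# Mapping from determined in-game effects to outputs (entries are illustrative).
EFFECT_OUTPUTS = {
    "splash": ("splash.wav", "mist_blast"),
    "power_up_gained": ("chime.wav", None),
    "point_scored": ("fanfare.wav", None),
}

def dispatch_effect(effect, out):
    """Look up an in-game effect and signal the corresponding outputs."""
    sound, physical = EFFECT_OUTPUTS.get(effect, (None, None))
    if sound:
        out.play_sound(sound)
    if physical:
        out.trigger_physical_effect(physical)

dispatch_effect("splash", OutputController())
```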
Furthermore, it may be noted that the interactive video game system 10 can also enable other functionality using the volumetric scanning data collected by the array 36 of sensing units 38. For example, as mentioned, in certain embodiments, the processing circuitry 35 of the primary controller 34 may generate a volumetric model that includes both the texture and the shape of each player. At the conclusion of game play, the processing circuitry 35 of the primary controller 34 can generate simulated images that use the volumetric models of the players to render a 3D likeness of each player within a portion of the virtual environment 32, and these images can be provided (e.g., printed, electronically transferred) to the players 12 as souvenirs of their game play experience. For example, this may include a print of a simulated image illustrating the volumetric model of a player crossing a finish line within a scene from the virtual environment 32.
The technical effects of the present approach include an interactive video game system that enables multiple players (e.g., two or more, four or more) to perform actions in a physical play area (e.g., a 2D or 3D play area) to control corresponding virtual representations in a virtual environment presented on a display device near the play area. The disclosed system includes a plurality of sensors and suitable processing circuitry configured to collect volumetric scanning data and generate various models, such as volumetric models, shadow models, and/or skeletal models, for each player. The system generates the virtual representation of each player based, at least in part, on the generated player models. Additionally, the interactive video game system may set or modify properties, such as size, texture, and/or color, of the virtual representations based on various attributes, such as points, purchases, and power-ups, associated with the players.
While only certain features of the present technique have been illustrated and described herein, many modifications and changes will occur to those skilled in the art. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes as fall within the true spirit of the present technique. Additionally, the techniques presented and claimed herein are referenced and applied to material objects and concrete examples of a practical nature that demonstrably improve the present technical field and, as such, are not abstract, intangible or purely theoretical. Further, if any claims appended to the end of this specification contain one or more elements designated as “means for [perform]ing [a function] . . . ” or “step for [perform]ing [a function] . . . ”, it is intended that such elements are to be interpreted under 35 U.S.C. 112(f). However, for any claims containing elements designated in any other manner, it is intended that such elements are not to be interpreted under 35 U.S.C. 112(f).