The game of basketball is a competitive sport that is immensely popular worldwide. It is played professionally, collegiately, in high schools, middle schools and elementary schools, and among friends and family, as well as individually. The game of basketball may be played in numerous ways and may take the form of, for example, an organized game between teams, a pickup game at a local park, or a game of HORSE in one's driveway.
Athletes often spend numerous hours practicing and training to improve their skill level so they can become more competitive in the game of basketball. In an effort to assist athletes in improving their skill level, systems have been developed that track an athlete's performance while training or playing and then provide feedback indicative of that performance, which can be analyzed and evaluated to help the athlete improve his or her skill level.
Game systems have a different focus which is primarily directed toward enhancing the recreational aspect of basketball. Such game systems may involve shooting competitions between family members and/or friends at one location.
In one aspect of the invention, a basketball shooting game method for at least two basketball players on basketball courts at remote locations is provided. The method includes the steps of matching up two players at remote locations who have selected to play the same basketball shooting game, acquiring real-time camera data from smart glasses that each player wears while the players are playing the selected basketball shooting game, analyzing the camera data to determine make/miss data for each player based upon how many shots each player made and missed during the selected basketball game, and communicating the camera data and the make/miss data of one player to the other player.
In another aspect of the invention, a method for enabling two basketball players on basketball courts at remote locations to play a selected basketball game against one another is provided. The method includes the steps of matching up two basketball players on basketball courts at remote locations to play a selected basketball shooting game, transmitting the real-time camera data from smart glasses worn by each player to a smart device of the other player, analyzing the data from each player's smart glasses to determine how many shots each player made and missed during the selected basketball game, and transmitting the number of shots made and missed by each player to the other player's smart device.
In another aspect of the invention, a method for enabling two basketball players on basketball courts at remote locations to play a selected basketball shooting game against one another is provided. The method includes the steps of matching up two players to play a selected basketball shooting game, transmitting the real-time camera data from smart glasses worn by each player to a smart device of the other player, analyzing the data from each player's smart glasses to recognize the basketball rim and backboard, analyzing the data from each player's smart glasses to determine each player's position on their basketball court, analyzing the data from each player's smart glasses to track the basketball, analyzing the data from each player's smart glasses to determine if each shot taken by each player was a make or miss without the need for sensors on the basketball, rim or backboard, displaying augmented reality information related to the selected basketball game on each player's smart glasses, and transmitting the number of shots made and missed by each player to both players' smart devices.
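The matchmaking and score-exchange steps recited above can be sketched as a minimal data flow. The names used here (`Player`, `match_players`, `scoreboard`) and the structure are illustrative assumptions for exposition only, not part of the disclosed system:

```python
from dataclasses import dataclass

@dataclass
class Player:
    """One participant on a remote court (illustrative model)."""
    name: str
    selected_game: str
    makes: int = 0
    misses: int = 0

def match_players(lobby):
    """Pair up players who selected the same shooting game."""
    waiting_by_game = {}
    pairs = []
    for p in lobby:
        waiting = waiting_by_game.get(p.selected_game)
        if waiting is None:
            waiting_by_game[p.selected_game] = p
        else:
            pairs.append((waiting, p))
            waiting_by_game[p.selected_game] = None
    return pairs

def record_shot(player, made):
    """Update a player's make/miss tally from analyzed camera data."""
    if made:
        player.makes += 1
    else:
        player.misses += 1

def scoreboard(a, b):
    """The data each player's smart device would display for both players."""
    return {p.name: {"makes": p.makes, "misses": p.misses} for p in (a, b)}
```

In this sketch, each player's glasses would feed `record_shot`, and the resulting `scoreboard` would be transmitted to both devices.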
Before any embodiments of the invention are explained in detail, it is to be understood that the invention is not limited in its application to the details of construction and the arrangement of components set forth in the following description or illustrated in the drawings. The invention is capable of other embodiments and of being practiced or of being carried out in various ways.
The smart glasses 12 worn by each player 18 and 20 include a processor 30 and a camera 32, as is shown in
An example of a basketball shooting game utilizing the invention is as follows and as shown in
The images, video, and other data collected and processed by a player's glasses 12 are communicated to the other player, such as by being wirelessly transmitted to the player's respective device 16, which in turn communicates with the other player's device 16 as is known in the art. With this system 10, the images, video, and data captured by the glasses 12 of the first player 18 at one basketball court location can be viewed in real time by the second player 20 at a remote basketball court location, and vice versa, thus enabling the play of basketball games by one player 18 against another player 20 anywhere in the world.
After the selected game is completed at step 52, each player 18 and 20 can go back to their device 16 at steps 54 and 56 to view their own scores and data, to view the other player's scores and data and to review which player 18 or 20 won the game at step 58. Each player 18 and 20 can then set up a rematch, proceed to another opponent or game or quit at step 60.
It should be noted that in the above example, two players 18 and 20 are matched up and playing the selected game against each other. However, the invention can be scaled so that any number of players can play against one another if each player has smart glasses and a device to access the application.
In another example of the system 10, indicia and/or graphics 62 can be projected by the glasses 12, as is shown in
Turning now to
The glasses 12 include right and left lenses 64 on a wearable frame 66. The lenses 64 allow natural light in, thus allowing the wearer to see the real world. The field of view (FOV) of most lenses 64 is preferably in the range of 30 degrees to 180 degrees. A positioning device 68 is mounted on the frame 66 and functions to determine a physical location of the wearer. The positioning device 68 may include both a geographic locator, such as a GPS device known in the art that utilizes signals from GPS satellites to determine the current position of a player 18 or 20, and orientation devices, such as known accelerometers and 3-axis gravity detectors, that determine the direction the player 18 or 20 is looking while wearing the glasses 12. The camera 32 is mounted to the frame 66 and captures images and video in the FOV. A display 70 is mounted on the frame 66, spaced in front of one or both lenses 64. The display 70 is preferably a small video screen, such as a micro-LED display, that the player 18 or 20 looks directly at, or is a projection device that displays information such as images, video, and other data onto one or both lenses 64. For example, every time a player 18 or 20 makes a shot, a green MAKE will show on the display 70 in real time, and every time the player 18 or 20 misses a shot, a red MISS will show on the display 70 in real time. The display 70 functions to overlay such information and/or graphics onto a user's FOV such that what a player 18 or 20 is seeing through the lenses 64 is augmented with the overlaid information 62, thus providing an augmented reality experience. The processor 30 is mounted to the frame 66 for analyzing and processing the data from the camera 32, such as by using computer vision (CV) techniques, and communicating that data to each player's device 16.
The glasses 12 preferably include a built-in microphone/speaker 72 and an earphone or earpiece 74. The microphone/speaker 72 permits a player 18 or 20 to speak to the other player in real time while playing the basketball game, and the earpiece 74 enables the players 18 and 20 to hear what the other players are saying in response. The glasses 12 include a conventional transceiver 76 in order to wirelessly communicate information, such as the camera images and/or other data relating to a player 18 or 20, and communicate through Bluetooth Low Energy (BLE). The smart glasses 12 have server manager code that contains functions for several different server file protocols including TCP, UDP, MIX and TLA. The file transfer functions preferably use multithreading to transfer and process the file bytes quickly and efficiently. The glasses 12 are powered, such as by a battery and/or electric charge.
As noted above, commercially available smart glasses can be utilized as part of the invention or, alternatively, specifically designed custom-made glasses can be designed and manufactured such as with more than one camera and faster processing speeds.
Turning now to
The interface 78 enables a player 18 or 20 to choose between several basketball games 96 by selecting from a predetermined menu. Games can include but are not limited to a three-point shooting game and a free throw shooting game. A player 18 or 20 can also choose to play the game with a different player.
In order to utilize the smart glasses 12 for the basketball shooting game between two players 18 and 20 on remote basketball courts, processes are needed to track the players 18 and 20 and to analyze the data from the glasses 12 by tracking the basketball 22, without the need for any sensors on the rim 24, backboard 26, or basketball 22, which would be an impediment to playing the game.
Turning now to
With respect to tracking the basketball 22, the data streamed and acquired from the camera 32 of the smart glasses 12 is utilized as shown in step 104 to identify the basketball 22 in step 110, track the basketball 22 in step 112, and determine made and missed shots in step 114. Preferably, a tool that the system 10 uses to interpret the data streamed from the camera 32 can be sourced from OpenCV, an open-source CV and machine learning software library. More specifically, an implementation of the OpenCV library made solely for use in Unity, OpenCV for Unity from Enox Software, can be used for tracking the basketball. Unity is advantageous in that it enables almost all commercially available smart glasses to be used in this game system. Filters can also be used to filter artificial data from the camera data. However, other software for processing the data from the smart glasses, whether commercially available or custom written, can also be utilized with the invention.
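The identify-and-track steps (steps 110 and 112) can be illustrated with a small pure-Python stand-in that locates the ball's centroid in a binary mask of the kind a color filter over the camera frames would produce. In the actual system this role is played by OpenCV routines; the function names here are assumptions for exposition:

```python
def ball_centroid(mask):
    """Centroid of the 'ball' pixels in a binary mask (a stand-in for
    contour detection on a color-filtered camera frame)."""
    pts = [(x, y) for y, row in enumerate(mask) for x, v in enumerate(row) if v]
    if not pts:
        return None  # ball not visible in this frame
    n = len(pts)
    return (sum(x for x, _ in pts) / n, sum(y for _, y in pts) / n)

def track(frames):
    """Per-frame ball positions, skipping frames where the ball is hidden."""
    path = []
    for mask in frames:
        c = ball_centroid(mask)
        if c is not None:
            path.append(c)
    return path
```

The resulting path of centroids is the input the make/miss step (step 114) would consume.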
HoloLens World Tracking can be used to identify the basketball backboard 26 in 3D space in step 116. To accomplish this, images of backboards are preferably scanned to enable the glasses 12 to recognize backboards of different designs. Alternatively, after putting the glasses 12 on, a player 18 or 20 will be asked to shoot any number of shots, such as 3-5, until the system 10 recognizes the backboard 26 in use. If desired, the entire court could be identified by the system 10 as well.
The OpenCV plugin can be used to identify and track the basketball 22 and process the camera data while tracking the backboard's position. During the processing, the OpenCV plugin functions will analyze the frames captured from the camera 32 to identify contour lines that match the circular shape of the basketball 22, as the images extracted from each frame are 2D. The processing also takes into account the size of the basketball 22, the focal length of the camera 32, the diameter of the basketball 22, and the color of the basketball 22, such as through use of the OpenCV Find Contours function. Alternatively, the system 10 can request a player 18 or 20 to place the basketball 22 in front of the smart glasses 12 until the system 10 recognizes the specific basketball that is being used in the game.
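The shape- and size-based checks described above reduce to two small formulas: a contour's circularity (which equals 1.0 for a perfect circle) and a pinhole-camera distance estimate from the ball's apparent diameter. This is a sketch of the underlying arithmetic, not the OpenCV implementation itself:

```python
import math

def circularity(area, perimeter):
    """4*pi*A / P^2: 1.0 for a perfect circle, smaller for other shapes.
    Used to decide whether a detected contour is round enough to be the ball."""
    return 4 * math.pi * area / perimeter ** 2

def ball_distance(focal_length_px, real_diameter_m, pixel_diameter):
    """Pinhole-camera estimate: distance = f * D_real / d_pixels.
    Relates the ball's known diameter and apparent size to its range."""
    return focal_length_px * real_diameter_m / pixel_diameter
```

For example, with an assumed 800-pixel focal length, a regulation ball of roughly 0.24 m diameter appearing 32 pixels wide would be about 6 m away.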
The image processing within OpenCV on the processor 30 is done in as many frames as possible in order to achieve good accuracy without slowing down the processing. In order to achieve better performance, the processing can scale the input frames down to 896×504 resolution (16:9 aspect ratio). The FOV can be kept at 48 degrees and the aspect ratio is kept the same to ensure consistent accuracy. However, it should be noted that faster processing speed, higher resolution and different aspect ratios can also be utilized as known by those of skill in the art. It should also be noted that a different processor not on the smart glasses 12 can also be used in addition to or in place of the processor 30.
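The downscaling step can be expressed as a small helper that scales any source frame to the 896-pixel target width stated above while preserving a 16:9 aspect ratio; the function name is an illustrative assumption:

```python
def scaled_size(src_w, src_h, target_w=896):
    """Target frame size for processing: fixed width, preserved aspect ratio.
    A 16:9 source such as 1920x1080 maps to 896x504."""
    scale = target_w / src_w
    return target_w, round(src_h * scale)
```

Keeping the aspect ratio fixed means the same pixel-per-degree relationship holds across source cameras, which is what keeps the accuracy consistent.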
The processing of the camera data also preferably detects the color of the basketball 22 and converts from RGB to HSV in order to precisely determine the color of the basketball 22 when it is in the camera frame data. Filtering is preferably done so that appropriately colored round orange objects that are far away, such as 0.5 miles away, are not identified as the basketball 22.
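The RGB-to-HSV conversion and color test can be sketched with Python's standard `colorsys` module; the specific hue and saturation thresholds below are illustrative assumptions, not values disclosed by the system:

```python
import colorsys

def is_basketball_orange(r, g, b):
    """True if an RGB pixel (0-255 per channel) falls in an assumed
    basketball-orange HSV band. HSV is used because hue is far more
    stable under lighting changes than raw RGB values."""
    h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    return 0.02 <= h <= 0.11 and s >= 0.5 and v >= 0.3
```

A mask built from this per-pixel test is what the contour-finding step would then scan for round shapes of the expected size.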
Turning now to the processing of basketball shot makes/misses and
It should also be noted that the system 10 can also provide predictive make/miss determinations if a player 18 or 20 turns their head away from the rim 24 thus not providing any camera data. In this situation, the system 10 can predict if the shot was a make/miss based on the trajectory mapping of the shot as it approaches the rim 24 with the camera data that is available.
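One way to realize such a prediction is to fit a parabola to the last few tracked ball positions and evaluate it at the rim plane. This sketch, with assumed function names, is one possible implementation of trajectory mapping, not the disclosed algorithm:

```python
def fit_parabola(p1, p2, p3):
    """Exact quadratic y = a*x^2 + b*x + c through three (x, y) samples."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    denom = (x1 - x2) * (x1 - x3) * (x2 - x3)
    a = (x3 * (y2 - y1) + x2 * (y1 - y3) + x1 * (y3 - y2)) / denom
    b = (x3 ** 2 * (y1 - y2) + x2 ** 2 * (y3 - y1) + x1 ** 2 * (y2 - y3)) / denom
    c = (x2 * x3 * (x2 - x3) * y1 + x3 * x1 * (x3 - x1) * y2
         + x1 * x2 * (x1 - x2) * y3) / denom
    return a, b, c

def predict_height_at(points, rim_x):
    """Predicted ball height where the arc reaches the rim plane,
    extrapolated from the last three tracked positions."""
    a, b, c = fit_parabola(*points[-3:])
    return a * rim_x ** 2 + b * rim_x + c
```

Comparing the predicted height at the rim plane against the rim height (within a tolerance) yields the predictive make/miss call when the camera loses sight of the ball.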
With respect to projecting graphics and game data 62 on the display 70 of the smart glasses 12, this is known in the art through use of a combiner that works like a partial mirror to redirect the display light and selectively let in light from the real world.
In another aspect of the invention, the camera output of the glasses 12 of the players 18 and 20 can be transmitted via a network to third parties to watch the games whether they are physically at one of the basketball courts or not.
Various features and advantages of the invention are set forth in the following claims.
This application claims the benefit of U.S. Provisional Application No. 62/729,532 filed on Sep. 11, 2018.
Published as U.S. Patent Application Publication No. 2020/0078658 A1, March 2020 (US). Priority: U.S. Provisional Application No. 62/729,532, September 2018 (US).