Basketball shooting game using smart glasses

Abstract
The present invention relates to a basketball shooting game played using smart glasses and, more specifically, to a real-time basketball shooting game that enables at least two players on basketball courts at remote locations anywhere in the world to play a shooting game against one another using smart glasses.
Description
BACKGROUND

The game of basketball is a competitive sport immensely popular worldwide. It is played professionally, collegiately, in high schools, middle schools and elementary schools, and among friends and family as well as individually. The game of basketball may be played in numerous ways, and, for example, may take the form of an organized game between teams, a pickup game at a local park, or a game of HORSE in one's driveway.


Athletes often spend numerous hours practicing and training in order to improve their skill level so they can become more competitive in the game of basketball. In an effort to assist athletes in improving their skill level, systems have been developed that track an athlete's performance while training or playing and then provide feedback indicative of that performance, which can be analyzed and evaluated to help the athlete improve their skill level.


Game systems have a different focus which is primarily directed toward enhancing the recreational aspect of basketball. Such game systems may involve shooting competitions between family members and/or friends at one location.


SUMMARY

In one aspect of the invention, a basketball shooting game method for at least two basketball players on basketball courts at remote locations is provided. The method includes the steps of matching up two players at remote locations that have selected to play the same basketball shooting game, acquiring real-time camera data from smart glasses that each player wears while the players are playing the selected basketball shooting game, analyzing the camera data to determine make/miss data for each player based upon how many shots each player made and missed during the selected basketball game and communicating the camera data and the make/miss data of one player to the other player.


In another aspect of the invention, a method for enabling two basketball players on basketball courts at remote locations to play a selected basketball game against one another is provided. The method includes the steps of matching up two basketball players on basketball courts at remote locations to play a selected basketball shooting game, transmitting the real-time camera data from smart glasses worn by each player to a smart device of the other player, analyzing the data from each player's smart glasses to determine how many shots each player made and missed during the selected basketball game and transmitting the number of shots made and missed by each player to the other player's smart device.


In another aspect of the invention, a method for enabling two basketball players on basketball courts at remote locations to play a selected basketball shooting game against one another is provided. The method includes the steps of matching up two players to play a selected basketball shooting game, transmitting the real-time camera data from smart glasses worn by each player to a smart device of the other player, analyzing the data from each player's smart glasses to recognize the basketball rim and backboard, analyzing the data from each player's smart glasses to determine each player's position on their basketball court, analyzing the data from each player's smart glasses to track the basketball, analyzing the data from each player's smart glasses to determine if each shot taken by each player was a make or miss without the need for sensors on the basketball, rim or backboard, displaying augmented reality information related to the selected basketball game on each player's smart glasses and transmitting the number of shots made and missed by each player to both players' smart devices.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a schematic view of one embodiment of the invention;



FIG. 2 is a flowchart of a basketball game being played by two players at basketball courts in remote locations utilizing the invention;



FIG. 3 is a schematic view of the smart glasses of the invention;



FIG. 4 is a schematic of an interface of an application of the present invention;



FIG. 5 is a schematic of the interface of the application;



FIG. 6 is a flowchart of the data processing of the invention;



FIG. 7 is a schematic of the collision detection boxes of the make/miss analysis; and



FIG. 8 is an illustration of the display on the smart glasses showing an augmented reality graphic.





Before any embodiments of the invention are explained in detail, it is to be understood that the invention is not limited in its application to the details of constructions and the arrangement of components set forth in the following description or illustrated in the drawings. The invention is capable of other embodiments and of being practiced or of being carried out in various ways.


DETAILED DESCRIPTION


FIG. 1 illustrates one embodiment of a basketball shooting game system 10 in accordance with the present invention. The system 10 preferably includes smart glasses 12 and an application 14 running on a smart device 16 for each game player 18 and 20. The smart device 16 is preferably a mobile phone or tablet, but other devices can also be utilized. Alternatives to the application 14 running on a smart device 16 can also be utilized, such as a dedicated computer with software loaded thereon. The system 10 enables the playing of a real-time basketball shooting game between at least two players 18 and 20 on basketball courts at remote locations anywhere in the world. The game is played on a physical basketball court, whether at a gym, a park, or a backyard, and played in connection with a conventional basketball 22, a rim 24 and backboard 26. Each player 18 and 20 wears smart glasses 12 that analyze and process data and that communicate with the player's device 16. The device 16 receives and processes the data from each player's smart glasses 12 and communicates with a remote processor 28. The smart glasses 12 provide camera data, track the position of the player 18 or 20, recognize the basketball 22 and basketball backboard 26 in use by each player 18 or 20 and will detect each player's shooting makes/misses so there is no need for additional sensors on the basketball rim 24, the backboard 26, or the basketball 22.


The smart glasses 12 worn by each player 18 and 20 include a processor 30 and a camera 32, as is shown in FIG. 3 and as will be discussed in detail below, so that the second player 20 at the remote location from the first player 18 can see everything the first player 18 sees. Real-time image, video and data are communicated to both players 18 and 20, such as by being transmitted from the first player's glasses 12 to the first player's device 16 as well as to the second player's device 16. Every time one player makes a shot, a green make will appear on the other player's device 16, and every time one player misses a shot, a red miss will be shown on the other player's device 16. The make/miss data of each player 18 and 20 are also displayed on that player's smart glasses 12, as is shown in FIG. 8 and as will be discussed in more detail below, along with other data such as location, temperature, and distance from the basketball rim. The smart glasses 12 include built-in communication so that two players 18 and 20 playing against each other can talk to one another while playing the game as well as listen to music.


An example of a basketball shooting game utilizing the invention is as follows and as shown in FIG. 2. A first player 18 logs into the application 14 on the first player's device 16 at step 34. The first player 18 selects a particular game from a list of options such as, for example, a three-point competition or a free-throw competition at step 36. The first player 18 is then matched up with a second player 20 at a basketball court in a remote location who also is logged into the application 14 on the second player's device 16 and who is interested in playing the same game at steps 38, 40, 42. Once both players 18 and 20 are matched and the game is determined, the players 18 and 20 put on their individual smart glasses 12 so that the camera 32 is activated and each player 18 and 20 can see what the other player sees on the basketball court at step 44. The players 18 and 20 then set aside their individual devices 16, such as in a pocket, on a chair, on a fence, or any other convenient location nearby, so they can then begin the selected shooting game at step 46. Both players 18 and 20 can play the selected game at the same time as shown in steps 48 and 50 or they can play one at a time while watching the other player on their device 16. If the players 18 and 20 play the selected game one at a time, each player 18 and 20 can monitor the current data and scores of the other player as well as make/miss data of the other player on their device 16. The application 14 can also provide for replays of each player's shots if desired. The application 14 also provides a timer and a game end alarm if the selected game requires one.
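The matchmaking flow above can be sketched as a waiting queue keyed by the selected game. This is a minimal illustrative sketch only; the `Matchmaker` class and its method names are hypothetical, not part of the disclosed application.

```python
from collections import defaultdict, deque

class Matchmaker:
    """Pairs players who have selected the same game (illustrative sketch)."""

    def __init__(self):
        # One waiting queue per game type, e.g. "three-point" or "free-throw".
        self.waiting = defaultdict(deque)

    def request_game(self, player_id, game_type):
        """Queue a player; return a matched pair once two want the same game."""
        queue = self.waiting[game_type]
        if queue:
            opponent = queue.popleft()
            return (opponent, player_id)  # both players are now matched
        queue.append(player_id)
        return None  # still waiting for an opponent


mm = Matchmaker()
assert mm.request_game("player18", "three-point") is None   # first player waits
match = mm.request_game("player20", "three-point")          # second player arrives
print(match)  # → ('player18', 'player20')
```

A player choosing a different game would simply wait in that game's own queue until a like-minded opponent arrives.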


The images, video and other data collected and processed by each player's glasses 12 are communicated to the other player, such as by being wirelessly transmitted to that player's respective device 16, which in turn communicates with the other player's device 16 as is known in the art. With this system 10, the images, video and data captured by the glasses 12 of the first player 18 at one basketball court location can be viewed in real-time by the second player 20 at a remote basketball court location and vice versa, thus enabling the play of basketball games by one player 18 against another player 20 anywhere in the world.


After the selected game is completed at step 52, each player 18 and 20 can go back to their device 16 at steps 54 and 56 to view their own scores and data, to view the other player's scores and data and to review which player 18 or 20 won the game at step 58. Each player 18 and 20 can then set up a rematch, proceed to another opponent or game or quit at step 60.


It should be noted that in the above example, two players 18 and 20 are matched up and playing the selected game against each other. However, the invention can be scaled so that any number of players can play against one another if each player has smart glasses and a device to access the application.


In another example of the system 10, indicia and/or graphics 62 can be projected by the glasses 12, as is shown in FIG. 8 and as will be discussed in detail below, to augment reality in addition to showing the make/miss data. For example, a player 18 or 20 could see a famous basketball player in their field of view and be required to take a basketball shot while seeing the famous player. In another example, graphics of basketball stadiums and fans can be added to make a player's current location look like a large arena. In another example, graphics can be added to the ball itself, as if the ball were on fire, or to the court itself, as if the court were in a snowy tundra.


Turning now to FIG. 3, an exemplary component design for the smart glasses 12 is shown. The glasses 12 can be commercially available smart glasses, such as HoloLens 2 available from Microsoft Corporation, with no additional hardware added. Alternatively, the glasses 12 can be custom designed and manufactured with the needed components. It should be noted that almost any pair of commercially available smart glasses will work in this system 10.


The glasses 12 include right and left lenses 64 on a wearable frame 66. The lenses 64 allow natural light in, thus allowing the wearer to see the real world. The field of view (FOV) of the lenses 64 is preferably in the range of 30 degrees to 180 degrees. A positioning device 68 is mounted on the frame 66 and functions to determine a physical location of the wearer. The positioning device 68 may include both a geographic locator, such as a GPS device known in the art that utilizes signals from GPS satellites to determine the current position of a player 18 or 20, as well as orientation devices such as known accelerometers and 3-axis gravity detectors that determine the direction the player 18 or 20 is looking while wearing the glasses 12. The camera 32 is mounted to the frame 66 and captures images and video in the FOV. A display 70 is mounted on the frame 66 spaced in front of one or both lenses 64. The display 70 is preferably a small video screen, such as a micro-LED display that the player 18 or 20 looks directly at, or is a projection device that displays information such as images, video, and other data onto one or both lenses 64. For example, every time a player 18 or 20 makes a shot, a green MAKE will show on the display 70 in real-time, and every time the player 18 or 20 misses a shot, a red MISS will show on the display 70 in real time. The display 70 functions to overlay such information and/or graphics onto a user's FOV such that what a player 18 or 20 is seeing through the lenses 64 is augmented with the overlaid information 62, thus providing an augmented reality experience. The processor 30 is mounted to the frame 66 for analyzing and processing the data from the camera 32, such as by using computer vision (CV) techniques, and communicating that data to each player's device 16.


The glasses 12 preferably include a built-in microphone/speaker 72 and earphone or earpiece 74. The microphone/speaker 72 permits a player 18 or 20 to speak to the other player in real time while playing the basketball game, and the earpiece 74 enables the players 18 and 20 to hear what the other players are saying in response. The glasses 12 include a conventional transceiver 76 in order to wirelessly communicate information, such as the camera images and/or other data relating to a player 18 or 20, such as through Bluetooth Low Energy (BLE). The smart glasses 12 have server manager code that contains functions for several different server file protocols including TCP, UDP, MIX and TLA. The file transfer functions preferably use multithreading to transfer and process the file bytes quickly and efficiently. The glasses 12 are powered such as by battery and/or electric charge.
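The multithreaded transfer of file bytes mentioned above can be illustrated with a producer/consumer sketch. The queue-based handoff, chunk size and function name here are assumptions for illustration, not details of the actual server manager code.

```python
import threading
import queue

def send_file(data: bytes, chunk_size: int = 4096) -> bytes:
    """Stream `data` in chunks on one thread while another reassembles it."""
    pipe = queue.Queue()
    received = bytearray()

    def producer():
        # Split the file bytes into fixed-size chunks and hand them off.
        for i in range(0, len(data), chunk_size):
            pipe.put(data[i:i + chunk_size])
        pipe.put(None)  # sentinel: end of file

    def consumer():
        # Reassemble chunks as they arrive, concurrently with the producer.
        while True:
            chunk = pipe.get()
            if chunk is None:
                break
            received.extend(chunk)

    threads = [threading.Thread(target=producer), threading.Thread(target=consumer)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return bytes(received)


payload = bytes(range(256)) * 100      # 25,600 bytes of test data
assert send_file(payload) == payload   # reassembled intact
```

In a real transport the consumer would sit behind a TCP or UDP socket rather than an in-process queue, but the threading pattern is the same.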


As noted above, commercially available smart glasses can be utilized as part of the invention or, alternatively, custom glasses can be designed and manufactured, such as with more than one camera and faster processing speeds.


Turning now to FIG. 4, there is illustrated an exemplary graphical user interface 78 of the application 14 which is accessed by each player's device 16. It should be noted that the interface 78 can be designed to look different and display different information as desired. The interface 78 includes a miniature basketball court graphic 80 where the location of each player's shots as makes/misses is shown, for example by x's and o's, and where the distance of a player's shots from the rim 24 can also be displayed. The player names 82 can be displayed, along with an indication 84 that the first player 18 is connected to the second player 20 and the game is on. The interface 78 displays the elapsed time 86 of the game being played, the total shots 88 attempted by the player 18 or 20, the number of makes/misses 90 and 92 of the player, and each player's rank 94 among all players that have played the selected basketball game. Each player's device 16 keeps track of a player's rank using any type of ranking system/method. A player's rank can be accessed even if the player 18 or 20 is not actively playing a basketball shooting game.
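Since any type of ranking system/method may be used, one possibility is an Elo-style update, sketched below. The K-factor and 400-point rating scale are conventional Elo assumptions, not values from the disclosure.

```python
def elo_update(winner: float, loser: float, k: float = 32.0):
    """One Elo-style rating update: the expected score follows a logistic
    curve on the rating difference, and the winner takes points from the loser."""
    expected_win = 1.0 / (1.0 + 10 ** ((loser - winner) / 400.0))
    delta = k * (1.0 - expected_win)
    return winner + delta, loser - delta

# Two evenly rated players: the winner gains exactly half the K-factor.
new_a, new_b = elo_update(1500.0, 1500.0)
assert new_a == 1516.0 and new_b == 1484.0
```

Beating a much higher-rated opponent would move both ratings by nearly the full K-factor, which is what makes the ranking self-correcting over many games.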


The interface 78 enables a player 18 or 20 to choose between several basketball games 96 by selecting from a predetermined menu. Games can include but are not limited to a three-point shooting game and a free throw shooting game. A player 18 or 20 can also choose to play the game with a different player.



FIG. 5 shows another graphical user interface 96 wherein there is a split screen 98 showing the camera data 100 from the first player 18 and the camera data 102 from the second player 20.


In order to utilize the smart glasses 12 for the basketball shooting game between two players 18 and 20 on remote basketball courts, processes are needed to track the players 18 and 20 and to analyze the data from the glasses 12 to track the basketball 22, without the need for any sensors on the rim 24, backboard 26 or basketball 22, which would be an impediment to playing the game.


Turning now to FIG. 6, with respect to player tracking, data from the positioning device 68 on the smart glasses 12 is acquired at step 104 and utilized to determine where a player 18 or 20 is on the basketball court at step 106 and utilized to mark where a player 18 or 20 takes a shot at step 108.
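The shot-marking step can be sketched as follows, assuming the positioning data has already been resolved into court coordinates; the function and field names are illustrative, not from the disclosure.

```python
import math

def mark_shot(shot_log, player_pos, rim_pos):
    """Record a shot attempt at the player's current court position along
    with its straight-line distance from the rim (positions in meters)."""
    dx = player_pos[0] - rim_pos[0]
    dy = player_pos[1] - rim_pos[1]
    shot_log.append({"position": player_pos, "rim_distance": math.hypot(dx, dy)})

shots = []
mark_shot(shots, player_pos=(3.0, 4.0), rim_pos=(0.0, 0.0))
assert shots[0]["rim_distance"] == 5.0   # a 3-4-5 triangle from the rim
```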


With respect to tracking the basketball 22, the data streamed and acquired from the camera 32 of the smart glasses 12 is utilized as shown in step 104 to identify the basketball 22 in step 110, track the basketball 22 in step 112 and determine made and missed shots in step 114. Preferably, a tool that the system 10 uses to interpret the data streamed from the camera 32 can be sourced from OpenCV, an open source CV and machine learning software library. More specifically, an implementation of the OpenCV library made solely for use in Unity, OpenCV for Unity from Enox Software, can be used for tracking the basketball. Unity is advantageous because it enables almost all commercially available smart glasses to be used in this game system. Filters can also be used to remove artificial data from the camera data. However, other software for processing the data from the smart glasses, whether commercially available or custom written, can also be utilized with the invention.
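The contour-matching idea, accepting only contours whose shape is close to a circle, can be sketched in plain Python. In the actual system this role is played by OpenCV's contour functions; the circularity threshold below is an assumed value for illustration.

```python
import math

def polygon_area_perimeter(points):
    """Shoelace area and perimeter of a closed contour given as (x, y) points."""
    area = 0.0
    perimeter = 0.0
    n = len(points)
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        area += x1 * y2 - x2 * y1
        perimeter += math.hypot(x2 - x1, y2 - y1)
    return abs(area) / 2.0, perimeter

def looks_like_ball(contour, min_circularity=0.8):
    """A perfect circle has circularity 4*pi*A/P^2 == 1; other blobs score lower."""
    area, perimeter = polygon_area_perimeter(contour)
    if perimeter == 0:
        return False
    circularity = 4.0 * math.pi * area / (perimeter ** 2)
    return circularity >= min_circularity

# A 64-sided regular polygon approximates the ball's circular outline.
circle = [(math.cos(2 * math.pi * k / 64), math.sin(2 * math.pi * k / 64))
          for k in range(64)]
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
assert looks_like_ball(circle)       # near-circular contour accepted
assert not looks_like_ball(square)   # square scores pi/4, about 0.785
```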


HoloLens World Tracking can be used to identify the basketball backboard 26 in 3D space in step 116. To accomplish this, images of backboards are preferably scanned to enable the glasses 12 to recognize backboards of different designs. Alternatively, after putting the glasses 12 on, a player 18 or 20 will be asked to shoot a number of shots, such as 3-5, until the system 10 recognizes the backboard 26 in use. If desired, the entire court could be identified by the system 10 as well.


The OpenCV plugin can be used to identify and track the basketball 22 and process the camera data while tracking the backboard's position. During the processing, the OpenCV plugin functions analyze the frames captured from the camera 32 to identify contour lines that match the circular shape of the basketball 22, since the images extracted from each frame are 2D. The processing also takes into account the apparent size of the basketball 22 in the frame, the focal length of the camera 32, the actual diameter of the basketball 22 and the color of the basketball 22, such as through use of the OpenCV Find Contours function. Alternatively, the system 10 can request a player 18 or 20 to place the basketball 22 in front of the smart glasses 12 until the system 10 recognizes the specific basketball that is being used in the game.
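The relationship implied above between the ball's known diameter, the camera's focal length and the ball's apparent size in the frame is the standard pinhole-camera model, which yields the ball's distance. The focal length and diameter below are illustrative assumed values, not parameters disclosed for the actual glasses.

```python
BALL_DIAMETER_M = 0.24    # regulation basketball is roughly 24 cm across
FOCAL_LENGTH_PX = 900.0   # assumed camera focal length, in pixels

def distance_to_ball(pixel_diameter: float) -> float:
    """Pinhole model: distance = real_size * focal_length / apparent_size."""
    return BALL_DIAMETER_M * FOCAL_LENGTH_PX / pixel_diameter

# A ball spanning 90 pixels in the frame is about 2.4 m from the camera.
assert abs(distance_to_ball(90.0) - 2.4) < 1e-9
```

The same relation is what lets the make/miss processing reject a ball that merely passes in front of or behind the rim, since its apparent size reveals its depth.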


The image processing within OpenCV on the processor 30 is done on as many frames as possible in order to achieve good accuracy without slowing down the processing. In order to achieve better performance, the processing can scale the input frames down to 896×504 resolution (16:9 aspect ratio). The FOV can be kept at 48 degrees and the aspect ratio is kept the same to ensure consistent accuracy. However, it should be noted that faster processing speeds, higher resolutions and different aspect ratios can also be utilized as known by those of skill in the art. It should also be noted that a different processor not on the smart glasses 12 can be used in addition to or in place of the processor 30.


The processing of the camera data also preferably detects the color of the basketball 22, converting from RGB to HSV in order to precisely determine the color of the basketball 22 when it is in the camera frame data. Filtering is preferably done so that similarly colored round orange objects, such as objects 0.5 miles away, are not identified as the basketball 22.
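A minimal sketch of the color test, using the standard library's RGB-to-HSV conversion; the hue, saturation and brightness thresholds are assumed values for a typical orange basketball, not thresholds from the disclosure. (Distant look-alike objects would additionally be rejected by their small apparent size.)

```python
import colorsys

def is_basketball_orange(r: int, g: int, b: int) -> bool:
    """Classify an RGB pixel (0-255 channels) as basketball-orange in HSV space."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    hue_degrees = h * 360.0
    # Orange sits roughly between red (0 deg) and yellow (60 deg); require
    # strong saturation and brightness so dull browns are rejected.
    return 5.0 <= hue_degrees <= 40.0 and s >= 0.5 and v >= 0.3

assert is_basketball_orange(200, 80, 40)      # typical ball leather
assert not is_basketball_orange(40, 40, 200)  # blue jersey, not the ball
```

Working in HSV makes the hue test largely independent of lighting intensity, which mostly shifts V rather than H.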


Turning now to the processing of basketball shot makes/misses and FIG. 7, preferably at the start of processing of the data, collision detection boxes 118 and 120 are created in order to ensure a proper basketball shot is processed. A first collision detection box 118 is positioned above the rim 24 adjacent the backboard 26 and a second collision detection box 120 is positioned below the rim 24. The system 10 detects collisions in succession between the basketball 22 and the two collision detection boxes 118 and 120, which are created to overlay the camera data. Once the system 10 detects a collision between the basketball 22 and the first collision detection box 118 and then the second collision detection box 120, in that order, the system records that as a basket made. The use of the focal length of the camera 32 to determine the distance of the basketball 22 from the camera 32 helps ensure that there are no errors in collision detection. In the 2D image processing, the basketball 22 may appear to be colliding with the collision detection boxes 118 and 120 if it is in front of or behind either of the boxes 118 and 120; however, using the apparent area of the basketball 22 when it collides with the boxes 118 and 120 ensures that no errors are made when processing.
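The above-then-below ordering test can be sketched as follows. The box coordinates, rim height and class names are illustrative assumptions, not values from the disclosure.

```python
class Box:
    """Axis-aligned collision detection box in court coordinates (meters)."""
    def __init__(self, x_min, x_max, y_min, y_max):
        self.x_min, self.x_max = x_min, x_max
        self.y_min, self.y_max = y_min, y_max

    def contains(self, x, y):
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max


class ShotJudge:
    """A shot counts as a make only if the ball enters the box above the
    rim and then the box below it, in that order."""
    def __init__(self, box_above: Box, box_below: Box):
        self.box_above = box_above
        self.box_below = box_below
        self.hit_above = False

    def update(self, x, y):
        """Feed successive ball positions; returns 'make' when detected."""
        if self.box_above.contains(x, y):
            self.hit_above = True
        elif self.hit_above and self.box_below.contains(x, y):
            self.hit_above = False
            return "make"
        return None


# Rim centered at x = 0; one box just above rim height (3.05 m), one just below.
judge = ShotJudge(Box(-0.3, 0.3, 3.1, 3.6), Box(-0.3, 0.3, 2.5, 3.0))
result = [judge.update(x, y) for x, y in [(-1.0, 4.0), (0.0, 3.3), (0.0, 2.8)]]
assert result[-1] == "make"   # above-then-below sequence registers a make
```

A ball that only grazes the lower box (a short miss) never arms the `hit_above` flag, so no make is recorded.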


It should also be noted that the system 10 can also provide predictive make/miss determinations if a player 18 or 20 turns their head away from the rim 24 thus not providing any camera data. In this situation, the system 10 can predict if the shot was a make/miss based on the trajectory mapping of the shot as it approaches the rim 24 with the camera data that is available.
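The trajectory-based prediction can be sketched by fitting a parabola to the last few observed ball positions and extrapolating to the rim plane. The sample points, rim location and make tolerance below are illustrative assumptions.

```python
def fit_parabola(p1, p2, p3):
    """Solve y = a*x^2 + b*x + c exactly through three (x, y) samples."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    denom = (x1 - x2) * (x1 - x3) * (x2 - x3)
    a = (x3 * (y2 - y1) + x2 * (y1 - y3) + x1 * (y3 - y2)) / denom
    b = (x3**2 * (y1 - y2) + x2**2 * (y3 - y1) + x1**2 * (y2 - y3)) / denom
    c = (x2 * x3 * (x2 - x3) * y1 + x3 * x1 * (x3 - x1) * y2
         + x1 * x2 * (x1 - x2) * y3) / denom
    return a, b, c

def predict_make(samples, rim_x, rim_y, tolerance=0.2):
    """Extrapolate the last three samples to the rim plane; a predicted
    ball height within `tolerance` meters of the rim counts as a make."""
    a, b, c = fit_parabola(*samples[-3:])
    y_at_rim = a * rim_x**2 + b * rim_x + c
    return abs(y_at_rim - rim_y) <= tolerance

# Ball arcing toward a rim 5 m away at 3.05 m height; these samples lie on
# y = -0.12*x^2 + 0.81*x + 2.05, which passes within 5 cm of the rim.
samples = [(0.0, 2.05), (1.0, 2.74), (2.0, 3.19)]
assert predict_make(samples, rim_x=5.0, rim_y=3.05)
```

In practice the system could fall back to this prediction only for the frames in which the rim leaves the camera's field of view, and use the collision-box test whenever the rim is visible.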


With respect to projecting graphics and game data 62 on the display 70 of the smart glasses 12, this is known in the art through use of a combiner that works like a partial mirror to redirect the display light and selectively let in light from the real world. FIG. 8 illustrates an example of the MAKE graphic as seen on the display 70 of the glasses 12. Other graphics and/or data that could be displayed include location, temperature, and distance from the basketball rim.


In another aspect of the invention, the camera output of the glasses 12 of the players 18 and 20 can be transmitted via a network to third parties to watch the games whether they are physically at one of the basketball courts or not.


Various features and advantages of the invention are set forth in the following claims.

Claims
  • 1. A basketball shooting game method for at least two basketball players on basketball courts at remote locations, said method comprising: matching up two players at remote locations that have selected to play the same basketball shooting game; acquiring real-time camera data from smart glasses that each player wears while the players are playing the selected basketball shooting game with each player using a basketball and a hoop; analyzing the camera data using an overlay over the camera data to determine make/miss data for each player based upon how many shots each player made and missed during the selected basketball game without using data from a sensor on the basketball and on the hoop; and communicating the camera data and the make/miss data of one player to the other player.
  • 2. A basketball shooting game method of claim 1 and further including the step of displaying indicia of make/miss data on each player's smart glasses.
  • 3. A basketball shooting game method of claim 1 and further including the step of displaying augmented reality graphics relating to at least one of make/miss data, famous basketball players, fans and the basketball itself on each player's smart glasses.
  • 4. A basketball shooting game method of claim 1 wherein in the communicating step, the camera data and the make/miss data is communicated to a player's smart device.
  • 5. A basketball shooting game method of claim 4 wherein the smart device includes an application loaded and running thereon.
  • 6. A basketball shooting game method of claim 1 and further including the step of determining which player won the selected basketball game and communicating that result to each player.
  • 7. A basketball shooting game method of claim 1 wherein the smart glasses enable the players to communicate via audio with one another during the playing of the selected basketball game.
  • 8. A basketball shooting game method of claim 1 wherein the smart glasses include a processor that performs the analyzing step.
  • 9. A basketball shooting game method of claim 1 and further including the step of analyzing the camera data to recognize at least one of the basketball, the hoop and a backboard.
  • 10. A basketball shooting game method of claim 1 and further including the step of analyzing the data from each player's smart glasses to determine each player's position on their basketball court.
  • 11. A basketball shooting game method of claim 1 and further including the step of analyzing the data from each player's smart glasses to track the basketball.
  • 12. A basketball shooting game method for at least two basketball players on basketball courts at remote locations, said method comprising: matching up two players at remote locations that have selected to play the same basketball shooting game with each player using a ball and a hoop; acquiring real-time camera data from smart glasses that each player wears while the players are playing the selected basketball shooting game; analyzing the camera data using an overlay over the camera data to determine make/miss data for each player based upon how many shots each player made and missed during the selected basketball game; and communicating the camera data and the make/miss data of one player to the other player, wherein in the analyzing step, no data is used from additional sources.
  • 13. A method for enabling two basketball players on basketball courts at remote locations to play a selected basketball game against one another, said method comprising: matching up two basketball players on basketball courts at remote locations to play a selected basketball shooting game with each player using a basketball and a hoop; transmitting real-time camera data from smart glasses worn by each player to a smart device of the other player; analyzing the camera data from each player's smart glasses using an overlay over the camera data to determine how many shots each player made and missed during the selected basketball game without using data from a sensor on the basketball and on the hoop; and transmitting the number of shots made and missed by each player to the other player's smart device.
  • 14. A method of claim 13 wherein the smart glasses include a processor and wherein the processor performs the analyzing step.
  • 15. The method of claim 13 wherein the smart device includes an application loaded and running thereon.
  • 16. A method for enabling two basketball players on basketball courts at remote locations to play a selected basketball shooting game against one another, said method comprising: matching up two players to play a selected basketball shooting game; transmitting real-time camera data from smart glasses worn by each player to a smart device of the other player; analyzing the data from each player's smart glasses to recognize a basketball, a rim and a backboard; analyzing the data from each player's smart glasses to determine each player's position on their basketball court; analyzing the data from each player's smart glasses to track the basketball; analyzing the data from each player's smart glasses to determine if each shot taken by each player was a make or miss using collision volumes overlaying the real-time camera data; displaying augmented reality information related to the selected basketball game on each player's smart glasses; and transmitting the number of shots made and missed by each player to both players' smart devices.
  • 17. A method of claim 16 wherein in the step of analyzing the data from each player's smart glasses to track the basketball, the shape of the basketball is utilized.
  • 18. A method of claim 16 wherein in the step of analyzing the data from each player's smart glasses to track the basketball, the color of the basketball is utilized.
  • 19. A method of claim 16 and further including the step of transmitting the camera data and shots made and missed to third parties.
  • 20. A method for enabling two basketball players on basketball courts at remote locations to play a selected basketball shooting game against one another, said method comprising: matching up two players to play a selected basketball shooting game; transmitting real-time camera data from smart glasses worn by each player to a smart device of the other player; analyzing the data from each player's smart glasses to recognize a basketball, a rim and a backboard; analyzing the data from each player's smart glasses to determine each player's position on their basketball court; analyzing the data from each player's smart glasses to track the basketball; analyzing the data from each player's smart glasses to determine if each shot taken by each player was a make or miss without the need for sensors on the basketball, the rim or the backboard; displaying augmented reality information related to the selected basketball game on each player's smart glasses; and transmitting the number of shots made and missed by each player to both players' smart devices; wherein in the step of analyzing the data from each player's smart glasses to determine if each shot taken by the player was a make or miss, a collision detection box above and a collision detection box below the basketball rim are utilized to determine if shots are made or missed.
Parent Case Info

This application claims the benefit of U.S. Provisional Application No. 62/729,532 filed on Sep. 11, 2018.

US Referenced Citations (228)
Number Name Date Kind
6805490 Levola Oct 2004 B2
7094164 Marty et al. Aug 2006 B2
7181108 Levola Feb 2007 B2
7184615 Levola Feb 2007 B2
7206107 Levola Apr 2007 B2
7483604 Levola Jan 2009 B2
7764413 Levola Jul 2010 B2
7850552 Marty et al. Dec 2010 B2
7854669 Marty et al. Dec 2010 B2
8160411 Levola et al. Apr 2012 B2
8194325 Levola et al. Jun 2012 B2
8254031 Levola Aug 2012 B2
8314993 Levola Nov 2012 B2
8320032 Levola Nov 2012 B2
8360578 Nummela Jan 2013 B2
8408982 Marty et al. Apr 2013 B2
8409024 Marty et al. Apr 2013 B2
8466953 Levola Jun 2013 B2
8477046 Alonso Jul 2013 B2
8494229 Jarvenpaa et al. Jul 2013 B2
8508848 Saarikko Aug 2013 B2
8547638 Levola Oct 2013 B2
8593734 Laakkonen Nov 2013 B2
8617008 Marty et al. Dec 2013 B2
8622832 Marty et al. Jan 2014 B2
8717392 Levola May 2014 B2
8774467 Ryan Jul 2014 B2
8830584 Saarikko et al. Sep 2014 B2
8908922 Marty et al. Dec 2014 B2
8913324 Schrader Dec 2014 B2
8948457 Marty et al. Feb 2015 B2
8950867 Macnamara Feb 2015 B2
9086568 Jarvenpaa Jul 2015 B2
9215293 Miller Dec 2015 B2
9238165 Marty et al. Jan 2016 B2
9283431 Marty et al. Mar 2016 B2
9283432 Marty et al. Mar 2016 B2
9310559 Macnamara Apr 2016 B2
9345929 Marty et al. May 2016 B2
9348143 Gao et al. May 2016 B2
D758367 Natsume Jun 2016 S
9358455 Marty et al. Jun 2016 B2
9370704 Marty Jun 2016 B2
9389424 Schowengerdt Jul 2016 B1
9390501 Marty et al. Jul 2016 B2
9417452 Schowengerdt et al. Aug 2016 B2
9429752 Schowengerdt et al. Aug 2016 B2
9470906 Kaji et al. Oct 2016 B2
9541383 Abovitz et al. Jan 2017 B2
9547174 Gao et al. Jan 2017 B2
9612403 Abovitz et al. Apr 2017 B2
9651368 Abovitz et al. May 2017 B2
9671566 Abovitz et al. Jun 2017 B2
9694238 Marty et al. Jul 2017 B2
9697617 Marty et al. Jul 2017 B2
D795952 Natsume Aug 2017 S
9726893 Gao et al. Aug 2017 B2
9734405 Marty et al. Aug 2017 B2
9740006 Gao Aug 2017 B2
D796503 Natsume et al. Sep 2017 S
D796504 Natsume et al. Sep 2017 S
D796505 Natsume et al. Sep 2017 S
D796506 Natsume et al. Sep 2017 S
D797735 Fraser et al. Sep 2017 S
D797743 Awad et al. Sep 2017 S
D797749 Awad et al. Sep 2017 S
9753286 Gao et al. Sep 2017 B2
9753297 Saarikko et al. Sep 2017 B2
9761055 Miller Sep 2017 B2
9766381 Jarvenpaa et al. Sep 2017 B2
9766703 Miller Sep 2017 B2
9767616 Miller Sep 2017 B2
9791700 Schowengerdt Oct 2017 B2
9804397 Schowengerdt et al. Oct 2017 B2
9830522 Mazur et al. Nov 2017 B2
9832437 Kass et al. Nov 2017 B2
9841601 Schowengerdt Dec 2017 B2
9844704 Thurman et al. Dec 2017 B2
9846306 Schowengerdt Dec 2017 B2
9846967 Schowengerdt Dec 2017 B2
9851563 Gao et al. Dec 2017 B2
9852548 Greco et al. Dec 2017 B2
9857170 Abovitz et al. Jan 2018 B2
9857591 Welch et al. Jan 2018 B2
9874749 Bradski et al. Jan 2018 B2
9874752 Gao et al. Jan 2018 B2
9881420 Miller Jan 2018 B2
9886624 Marty et al. Feb 2018 B1
9889367 Minkovitch Feb 2018 B2
9891077 Kaehler Feb 2018 B2
9904058 Yeoh et al. Feb 2018 B2
9911233 O'Connor et al. Mar 2018 B2
9911234 Miller Mar 2018 B2
9915824 Schowengerdt et al. Mar 2018 B2
9915826 Tekolste et al. Mar 2018 B2
9922462 Miller Mar 2018 B2
9928654 Miller Mar 2018 B2
9939643 Schowengerdt Apr 2018 B2
9946071 Schowengerdt Apr 2018 B2
9948874 Kaehler Apr 2018 B2
9952042 Abovitz et al. Apr 2018 B2
9972132 O'Connor et al. May 2018 B2
9978182 Yeoh et al. May 2018 B2
9984506 Miller May 2018 B2
9990777 Bradski Jun 2018 B2
9996977 O'Connor et al. Jun 2018 B2
10008038 Miller Jun 2018 B2
10010778 Marty et al. Jul 2018 B2
10013806 O'Connor et al. Jul 2018 B2
10015477 Grata et al. Jul 2018 B2
10021149 Miller Jul 2018 B2
10042097 Tekolste Aug 2018 B2
10042166 Yeoh et al. Aug 2018 B2
10043312 Miller et al. Aug 2018 B2
10048501 Gao et al. Aug 2018 B2
10060766 Kaehler Aug 2018 B2
10061130 Gao et al. Aug 2018 B2
10068374 Miller et al. Sep 2018 B2
10073267 Tekolste et al. Sep 2018 B2
10073272 Kaji et al. Sep 2018 B2
10078919 Powderly et al. Sep 2018 B2
10089453 Kaehler Oct 2018 B2
10089526 Amayeh et al. Oct 2018 B2
10092793 Marty et al. Oct 2018 B1
10100154 Bhagat Oct 2018 B2
10101802 Abovitz Oct 2018 B2
10109061 Bose et al. Oct 2018 B2
10109108 Miller et al. Oct 2018 B2
10115232 Miller et al. Oct 2018 B2
10115233 Miller et al. Oct 2018 B2
10126812 Miller et al. Nov 2018 B2
10127369 Kaehler Nov 2018 B2
10127723 Miller Nov 2018 B2
10134186 Schowengerdt et al. Nov 2018 B2
D836105 Natsume et al. Dec 2018 S
D836106 Natsume et al. Dec 2018 S
D836107 Natsume et al. Dec 2018 S
D836108 Natsume et al. Dec 2018 S
D836109 Natsume et al. Dec 2018 S
D836630 Natsume et al. Dec 2018 S
D836631 Natsume et al. Dec 2018 S
D836632 Natsume et al. Dec 2018 S
D836633 Natsume et al. Dec 2018 S
D836634 Natsume et al. Dec 2018 S
10146997 Kaehler Dec 2018 B2
10151875 Schowengerdt et al. Dec 2018 B2
10156722 Gao et al. Dec 2018 B2
10156725 TeKolste et al. Dec 2018 B2
10162184 Gao et al. Dec 2018 B2
10163010 Kaehler et al. Dec 2018 B2
10163011 Kaehler et al. Dec 2018 B2
10163265 Miller et al. Dec 2018 B2
10175478 Tekolste et al. Jan 2019 B2
10175491 Gao et al. Jan 2019 B2
10175564 Yaras Jan 2019 B2
10176639 Schowengerdt et al. Jan 2019 B2
10180734 Miller et al. Jan 2019 B2
10185147 Lewis Jan 2019 B2
10186082 Min et al. Jan 2019 B2
10186085 Greco et al. Jan 2019 B2
10191294 Macnamara Jan 2019 B2
10198864 Miller Feb 2019 B2
10203762 Bradski et al. Feb 2019 B2
10210471 King et al. Feb 2019 B2
10228242 Abovitz et al. Mar 2019 B2
10234687 Welch et al. Mar 2019 B2
10234939 Miller et al. Mar 2019 B2
10237540 Welch et al. Mar 2019 B2
10241263 Schowengerdt et al. Mar 2019 B2
D845296 Natsume et al. Apr 2019 S
D845297 Natsume et al. Apr 2019 S
10249087 Wei et al. Apr 2019 B2
10254454 Wei et al. Apr 2019 B2
10254483 Schowengerdt et al. Apr 2019 B2
10254536 Yeoh et al. Apr 2019 B2
10255529 Rabinovich et al. Apr 2019 B2
10260864 Edwin et al. Apr 2019 B2
10261162 Bucknor et al. Apr 2019 B2
10261318 TeKolste et al. Apr 2019 B2
10262462 Miller et al. Apr 2019 B2
10267970 Jones, Jr. et al. Apr 2019 B2
10275902 Bradski Apr 2019 B2
10282611 Amayeh et al. May 2019 B2
10282907 Miller et al. May 2019 B2
10288419 Abovitz et al. May 2019 B2
10295338 Abovitz et al. May 2019 B2
10296792 Spizhevoy et al. May 2019 B2
10302957 Sissom May 2019 B2
10304246 Schowengerdt et al. May 2019 B2
10306213 Sissom et al. May 2019 B2
D850103 Natsume et al. Jun 2019 S
10313639 Wei Jun 2019 B2
10313661 Kass Jun 2019 B2
10317690 Cheng Jun 2019 B2
10332315 Samec et al. Jun 2019 B2
10337691 Kaehler et al. Jul 2019 B2
10338391 Yeoh et al. Jul 2019 B2
10343015 Marty et al. Jul 2019 B2
10345590 Samec et al. Jul 2019 B2
10345591 Samec et al. Jul 2019 B2
10345592 Samec et al. Jul 2019 B2
10345593 Samec et al. Jul 2019 B2
10352693 Abovitz et al. Jul 2019 B2
10359631 Samec et al. Jul 2019 B2
10359634 Yeoh et al. Jul 2019 B2
10360685 Marty et al. Jul 2019 B2
10365488 Samec et al. Jul 2019 B2
10371876 Menezes et al. Aug 2019 B2
10371896 Yeoh et al. Aug 2019 B2
10371945 Samec et al. Aug 2019 B2
10371946 Samec et al. Aug 2019 B2
10371947 Samec et al. Aug 2019 B2
10371948 Samec et al. Aug 2019 B2
10371949 Samec et al. Aug 2019 B2
10378882 Yeoh et al. Aug 2019 B2
10378930 Kaehler Aug 2019 B2
10379350 Samec et al. Aug 2019 B2
10379351 Samec et al. Aug 2019 B2
10379353 Samec et al. Aug 2019 B2
10379354 Samec et al. Aug 2019 B2
20050143154 Bush Jun 2005 A1
20080015061 Klein Jan 2008 A1
20090147992 Tong Jun 2009 A1
20130095924 Geisner et al. Apr 2013 A1
20150382076 Davisson Dec 2015 A1
20170072283 Davisson Mar 2017 A1
20190087661 Lee et al. Mar 2019 A1
20190392729 Lee Dec 2019 A1
Related Publications (1)
Number Date Country
20200078658 A1 Mar 2020 US
Provisional Applications (1)
Number Date Country
62729532 Sep 2018 US