This invention relates to a system and method for providing a game-play environment.
Conventional methods of tracking an object (e.g., golf ball, basketball, baseball, etc.) employ various types of sensors, including Doppler radar technology, camera-based technology, high-speed 3D camera-based technology, and stereoscopic sensors. The sensors can be configured to track the object and, with the aid of a computer, can recreate the movement of the object in a computerized virtual environment. In athletic applications, these tracking systems have been used to provide feedback for coaching, player development, and other training/improvement applications, with a focus on the movement of a virtual object relative to a virtual environment. The prior art is principally focused on providing analysis of the player's technique and the resulting effect on the flight path of the object. These systems have found particular benefit in the area of golf instruction.
Conventional indoor golf simulators utilize sensors, as mentioned above, to represent data points in a virtual space, which is projected onto a screen into which the golf ball is hit. Such simulators monitor the initial flight of the ball with sensors, extrapolate the full flight of the ball, and relay those data points to a computer system that creates a representation of the data points in a virtual space, such as a virtualized hole on a golf course. The prior art focuses on capturing the data points and incorporating them into a predominantly virtual environment, with no identifiable links to the physical environment where the golf ball was actually hit.
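The extrapolation step described above can be sketched in simplified form: given measured launch conditions, a drag-free projectile model estimates the remainder of the flight. The function names and the no-drag assumption are illustrative only; commercial simulators model drag, lift, and spin.

```python
import math

# Hedged sketch of extrapolating a ball's full flight from measured launch
# conditions (speed, launch angle), using simple drag-free projectile motion.
# Real simulators account for drag and spin; this simplification is an
# illustrative assumption, not the method of any particular simulator.

def carry_distance(speed_mps, launch_angle_deg, g=9.81):
    """Horizontal carry of a drag-free projectile launched from ground level."""
    theta = math.radians(launch_angle_deg)
    return speed_mps ** 2 * math.sin(2 * theta) / g

def flight_path(speed_mps, launch_angle_deg, steps=50, g=9.81):
    """Sample (x, y) points along the drag-free trajectory, from launch
    to landing, suitable for rendering in a virtual space."""
    theta = math.radians(launch_angle_deg)
    vx = speed_mps * math.cos(theta)
    vy = speed_mps * math.sin(theta)
    t_total = 2 * vy / g          # time until the ball returns to y = 0
    pts = []
    for i in range(steps + 1):
        t = t_total * i / steps
        pts.append((vx * t, vy * t - 0.5 * g * t * t))
    return pts
```

A simulator would feed each sampled point to the rendering system to animate the virtual ball's flight.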
It is apparent that there is a need for a system and method of tracking a ball, or other object, and rendering the flight path of that ball in a virtual gaming environment that is coordinated with the physical environment in which the ball is struck, as well as providing games and results based on various targets within the physical or virtual environment. Additionally, there is a need to alter the game-play environment that the user experiences without changing the physical environment into which the ball or object is hit, and to use sensors to track the entire flight of the ball in the actual physical environment. The present invention is directed to solving these needs and to providing techniques that fulfill them.
Before proceeding to a detailed description of the invention, however, it should be noted and remembered that the description of the invention which follows, together with the accompanying drawings, should not be construed as limiting the invention to the examples (or embodiments) shown and described. This is so because those skilled in the art to which the invention pertains will be able to devise other forms of this invention within the ambit of the appended claims.
Described herein is a game-play environment that includes a tee box, a range surface, and a monitor. The tee box is configured to allow a player to hit a golf ball onto the range surface. The range surface has a plurality of physical markers. The monitor is positioned so that the player can see the monitor while in the tee box. The monitor depicts a virtual environment that corresponds to a desired virtual game. Depending on the particular game selected, a set of virtual components are displayed on the monitor. Some of these virtual components are visual cues that correspond to the physical markers on the range surface. The player can achieve the game's objectives by targeting the appropriate physical marker that corresponds to the desired visual cue.
The foregoing has outlined in broad terms some of the more important features of the invention disclosed herein so that the detailed description that follows may be more clearly understood, and so that the contribution of the named inventors to the art may be better appreciated. The invention is not to be limited in its application to the details of the construction and to the arrangements of the components set forth in the following description or illustrated in the drawings. Rather, the invention is capable of other embodiments and of being practiced and carried out in various other ways not specifically enumerated herein. Finally, it should be understood that the phraseology and terminology employed herein are for the purpose of description and should not be regarded as limiting, unless the specification specifically so limits the invention.
These and further aspects of the invention are described in detail in the following examples and accompanying drawings.
While this invention is susceptible of embodiment in many different forms, there is shown in the drawings, and will be described hereinafter in detail, some specific embodiments of the invention. It should be understood, however, that the present disclosure is to be considered an exemplification of the principles of the invention and is not intended to limit the invention to the specific embodiments so described.
In accordance with an illustrative embodiment of the invention, a game-play environment 10 includes a tee box 100 and a range surface 200. The tee box 100 includes a ball 110 and a monitor 400. The range surface 200 includes a plurality of physical markers 210, 220, 230, and 240.
Turning to
Some of these visual components correspond with physical aspects of the range surface 200 and tee box 100. For example, the visual cues 450, 460, 470, and 480 correspond with the physical markers 210, 220, 230, and 240 respectively. Importantly, the relative positions and distances between the physical markers 210, 220, 230, and 240 are the same as the relative positions and distances depicted between the visual cues 450, 460, 470, and 480. It will be understood that by depicting a plurality of visual cues in the virtual environment that correspond to a plurality of physical markers on the range surface 200, various desirable features of the golf game become possible. It will be further understood that other games can benefit from the correspondence of physical markers with visual cues, including without limitation baseball, football, ultimate frisbee, tennis, and others.
One such benefit is that after a player 300 strikes the golf ball 110, the place where the golf ball 110 comes to rest on the range surface 200 can be depicted within the virtual environment as being at a position and distance from each of the plurality of visual cues that corresponds to the position and distance of the golf ball 110 from each of the plurality of physical markers on the range surface. For example, if the resting place of the golf ball 110 is 10 feet north of physical marker 220, 15 feet west of physical marker 230, and 40 feet south of physical marker 240, the monitor 400 will display a virtual golf ball 490 as being 10 feet north of visual cue 460, 15 feet west of visual cue 470, and 40 feet south of visual cue 480.
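Because the visual cues preserve the markers' relative layout, mapping a ball's physical resting point into the virtual environment reduces to a fixed offset. The following is a minimal sketch of that mapping; the coordinate values and identifiers are illustrative assumptions, not taken from the specification.

```python
# Hypothetical sketch: place a virtual ball so that its offset from every
# visual cue matches the physical ball's offset from the corresponding
# physical marker. All coordinates below are illustrative.

PHYSICAL_MARKERS = {          # (x, y) positions on the range surface, in feet
    210: (0.0, 0.0),
    220: (50.0, 10.0),
    230: (65.0, -20.0),
    240: (50.0, 60.0),
}

VIRTUAL_CUES = {              # corresponding cue positions in the virtual scene
    450: (100.0, 100.0),
    460: (150.0, 110.0),
    470: (165.0, 80.0),
    480: (150.0, 160.0),
}

CUE_FOR_MARKER = {210: 450, 220: 460, 230: 470, 240: 480}

def virtual_ball_position(ball_xy):
    """Map the ball's physical resting point into virtual coordinates.
    Because relative positions are identical, any marker/cue pair gives
    the same answer; the first pair is used as the anchor."""
    marker_id = 210
    mx, my = PHYSICAL_MARKERS[marker_id]
    cx, cy = VIRTUAL_CUES[CUE_FOR_MARKER[marker_id]]
    bx, by = ball_xy
    return (cx + (bx - mx), cy + (by - my))
```

The monitor 400 would then draw the virtual golf ball 490 at the returned coordinates.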
Another benefit of depicting a plurality of visual cues in the virtual environment that correspond to a plurality of physical markers on the range surface 200 is that the actual path that the golf ball 110 travels from the tee box 100 to the range surface 200 can be depicted within the virtual environment and displayed on the monitor 400.
Yet another benefit of depicting a plurality of visual cues in the virtual environment that correspond to a plurality of physical markers on the range surface 200 is that the player 300 can use the plurality of physical markers as targets that correspond to particular visual components depicted within the virtual environment. For example, turning to
It will be understood that the virtual environment may also be adjusted so that the visual cues 450, 460, 470, and 480 that correspond to physical markers 210, 220, 230, and 240 are better aligned with the desired visual components. For example,
It will be further understood that other visual components can be displayed to assist the player 300 in aiming. For example, in
It will be understood that other visual components appropriate to a golf game, if desired, may be used including without limitation fairways, sand traps, virtual tee boxes, water hazards, and out of bounds markers. In addition, it may be desirable to include other visual components to be depicted within the virtual environment that do not necessarily affect the play of the virtual golf game, but rather fill the background of the virtual environment, including without limitation, rivers, lakes, houses and other structures, mountains, trees, oceans, cliffs, clouds, and other weather-related constructs.
Turning back to
It is understood that the various embodiments of the game have different objectives and goals. In the embodiment shown, the objective is to get the virtual golf ball 490 to the cup on the golf green 498 in the fewest number of golf shots possible.
Turning now to
The plurality of selectable menu items 502 allows a player of the virtual golf game to select the desired virtual environment they wish to encounter during their play time. Each of the virtual environments is displayed to the user on a screen or monitor and is imposed virtually onto the physical range, which allows the physical range to remain unchanged while presenting a variety of virtual environments to the player(s). The plurality of selectable menu items 502 may include, but is not limited to, virtual golf games, courses, tournaments, coaching levels, and practice environments.
Each of the selectable menu items 502 may include a variety of options which may be selected to determine the desired virtual environment. For example, the selectable menu item 502 entitled “Games” may include the virtual game of golf described above, card games such as 21 or poker, darts, long drive, HORSE, and the like. By way of further example, the selectable menu item 502 entitled “Courses” may include a list of golf courses with different environments and topography which may be selected. The selectable menu item 502 entitled “Tournaments” may include a list of various contests, or special instances of a game or golf course, in which there are a limited number of players or entries allowed, and within which players can earn rewards for earning the best score. The selectable menu item 502 entitled “Coaching” may provide a list of various coaching modules to select, such as distance training, use of various clubs, swing modification, etc. Also, the selectable menu item 502 entitled “Practice” may provide a list of practice courses, clubs, greens, distances and other variables which may be selected.
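The hierarchy of menu items and options described above might be modeled as a simple nested structure. The sketch below mirrors the examples given in the text; the data structure itself and entries such as the course names are hypothetical, not taken from the specification.

```python
# Illustrative sketch of the selectable menu items 502 and their options.
# Entries marked with course/module names are hypothetical examples.
MENU_ITEMS = {
    "Games": ["Virtual Golf", "21", "Poker", "Darts", "Long Drive", "HORSE"],
    "Courses": ["Links Course", "Mountain Course", "Desert Course"],
    "Tournaments": ["Weekly Long Drive", "Club Championship"],
    "Coaching": ["Distance Training", "Club Selection", "Swing Modification"],
    "Practice": ["Driving Range", "Putting Green", "Approach Shots"],
}

def options_for(menu_item):
    """Return the selectable options for a top-level menu item,
    or an empty list if the item is unknown."""
    return MENU_ITEMS.get(menu_item, [])
```

Selecting an option would then determine which virtual environment is imposed onto the unchanged physical range.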
The game setup screen 500 also allows additional players and player information to be added, deleted, or revised by selecting from the plurality of player boxes 504. Additionally, the skill level of a particular player may be added or changed by selecting the drop-down menu 506 associated with a particular player box 504. Referring to
Turning to
Each of the target layouts in
When the next button 510 on the game setup screen 500 is pressed, and a particular game has been selected from the plurality of selectable menu items 502, the system will display a new screen showing the settings for the selected game. Turning to
Turning now to
One or more players take turns trying to hit a particular playing card from the plurality of playing cards 530 with a sports ball to win the card or value assigned to the playing card. The players may take numerous turns to attempt to reach a certain total value, such as 21 for blackjack, or the values necessary to win 3-card or 5-card poker, etc. After all of the players have made the set number of attempts, the system resolves the players' resulting hands against each other based on the rules and hand hierarchy required by the game and then announces as the winner the player with the best hand. For example, in blackjack, or 21, the player with the points total closest to 21 without going over 21 wins. Additionally, if there is a tie, the tied players may be allowed to take another turn to attempt to hit the highest-value card on the virtual playing field. If a player does not hit a card during a turn, the player may have the opportunity to draw a card from the cards remaining in the deck, which is randomly assigned by the computer system. The probability of drawing a card helpful to that player's hand can be weighted such that a player of higher skill is more likely to draw a card that is unhelpful, while a player of lower skill is more likely to draw a card that is helpful, based on the cards in the player's hand during that turn.
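The skill-weighted draw described above can be sketched as follows. The weighting scheme and the "helpfulness" test are illustrative assumptions; the specification does not prescribe a particular formula.

```python
import random

# Hedged sketch of the skill-weighted card draw: when a player misses every
# card on a turn, the deck draw is biased by skill level so that a
# higher-skill player is more likely to receive an unhelpful card and a
# lower-skill player a helpful one. The scheme below is one possible model.

def weighted_draw(deck, hand_total, skill, target=21, rng=random):
    """Draw one card value from `deck` (a list of point values).
    `skill` is in [0, 1], where 1.0 is the most skilled player."""
    def helpful(card):
        # A card is "helpful" if it improves the hand without busting.
        return hand_total + card <= target

    # Low skill -> helpful cards get high weight; high skill -> the reverse.
    weights = [(1.0 - skill) if helpful(c) else skill for c in deck]
    if sum(weights) == 0:
        weights = [1.0] * len(deck)   # degenerate case: uniform draw
    return rng.choices(deck, weights=weights, k=1)[0]
```

At the extremes the draw is deterministic: a skill of 0.0 always yields a helpful card (if one exists), and a skill of 1.0 always yields an unhelpful one.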
Turning now to
One or more players take turns trying to hit a particular target from the plurality of visual targets 540 with a sports ball to win the value assigned to that target and/or the rings around the target. After all of the players have made the set number of attempts, the system resolves each player's resulting score against the other players' scores by summing the total number of points for each player. Additionally, if there is a tie, the tied players may be allowed to take another turn to attempt to hit the highest-value target on the virtual playing field. If a player misses all of the targets during a turn, the player may be awarded a minimum number of points based on the skill level of the player.
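The score resolution just described can be sketched in a few lines. The player names and point values are illustrative; a tie is surfaced by returning more than one winner so the tied players can be offered an extra turn.

```python
# Minimal sketch of resolving the target game: sum each player's points
# across attempts and report the highest total, flagging ties.
def resolve_scores(attempts):
    """`attempts` maps player -> list of points earned per attempt.
    Returns (winners, totals); more than one winner signals a tie."""
    totals = {player: sum(points) for player, points in attempts.items()}
    best = max(totals.values())
    winners = [p for p, t in totals.items() if t == best]
    return winners, totals
```

For example, if two players tie at 30 points, both appear in `winners` and would take a tiebreaker turn at the highest-value target.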
Although the plurality of visual targets 540 is shown in two dimensions, lying flat on the playing surface in the X-Y plane, it will be understood that the plurality of visual targets 540 could also be displayed in three dimensions, similar to the plurality of playing cards 530 in
Although
It will also be understood that although the virtual environment shown in
Turning to
As used herein, the term “computer” may refer, but is not limited, to a laptop or desktop computer, or a mobile device, such as a tablet, cellular phone, smart phone, personal media player (e.g., iPod), wearable computer, implantable computer, or the like. Such computing devices may operate using one or more operating systems, including, but not limited to, Windows, MacOS, Linux, Unix, iOS, Android, Chrome OS, Windows Mobile, Windows CE, Windows Phone OS, Blackberry OS, and the like.
As used herein, the term “mobile device” may refer, but is not limited, to any computer, as defined herein, that is not fixed in one location. Examples of mobile devices include smart phones, personal media players, portable digital assistants, tablet computers, wearable computers, implanted computers, and laptop computers.
The system and method described herein may be deployed in part or in whole through network infrastructures. The network infrastructure may include elements such as computing devices, servers, routers, hubs, firewalls, clients, personal computers, communication devices, routing devices and other active and passive devices, modules and/or components as known in the art. The computing and/or non-computing device(s) associated with the network infrastructure may include, apart from other components, a storage medium such as flash memory, buffer, stack, RAM, ROM and the like. The processes, methods, program codes, instructions described herein and elsewhere may be executed by one or more of the network infrastructural elements.
The computer software, program codes, and/or instructions may be stored and/or accessed on machine readable media that may include: computer components, devices, and recording media that retain digital data used for computing for some interval of time; semiconductor storage known as random access memory (RAM); mass storage typically for more permanent storage, such as optical discs, forms of magnetic storage like hard disks, tapes, drums, cards and other types; processor registers, cache memory, volatile memory, non-volatile memory; optical storage such as CD, DVD; removable media such as flash memory (e.g. USB sticks or keys), floppy disks, magnetic tape, paper tape, punch cards, standalone RAM disks, Zip drives, removable mass storage, off-line, and the like; other computer memory such as dynamic memory, static memory, read/write storage, mutable storage, read only, random access, sequential access, location addressable, file addressable, content addressable, network attached storage, storage area network, bar codes, magnetic ink, and the like, and preferably includes at least one tangible, non-transitory medium storing instructions executable to cause the system to perform functions described herein. Preferably, the computer-readable storage device includes a tangible, non-transitory medium. Such non-transitory media excludes, for example, transitory waves and signals. “Non-transitory memory” should be interpreted to exclude computer readable transmission media, such as signals, per se.
The systems and/or methods described herein, and steps thereof, may be realized in hardware, software or any combination of hardware and software suitable for a particular application. The hardware may include a general-purpose computer and/or dedicated computing device or specific computing device or particular aspect or component of a specific computing device. The processes may be realized in one or more microprocessors, microcontrollers, embedded microcontrollers, programmable digital signal processors or other programmable device, along with internal and/or external memory. The processes may also, or instead, be embodied in an application specific integrated circuit, a programmable gate array, programmable array logic, or any other device or combination of devices that may be configured to process electronic signals. It will further be appreciated that one or more of the processes may be realized as a computer executable code capable of being executed on a machine-readable medium.
The computer executable code may be created using a structured programming language such as C, an object-oriented programming language such as C++ or a framework such as .NET, a lightweight data-interchange format such as JavaScript Object Notation (JSON) exchanged over HTTP POST request/response, or any other high-level or low-level programming language (including assembly languages, hardware description languages, and database programming languages and technologies) that may be stored, compiled, or interpreted to run on one of the above devices, as well as heterogeneous combinations of processors, processor architectures, or combinations of different hardware and software, or any other machine capable of executing program instructions.
Thus, in one aspect, each method described above and combinations thereof may be embodied in computer executable code that, when executing on one or more computing devices, performs the steps thereof. In another aspect, the processes may be embodied in systems that perform the steps thereof, and may be distributed across devices in a number of ways, or all of the functionality may be integrated into a dedicated, standalone device or other hardware. In another aspect, the means for performing the steps associated with the processes described above may include any of the hardware and/or software described above. All such permutations and combinations are intended to fall within the scope of the present disclosure.
It is to be understood that the terms “including”, “comprising”, “consisting” and grammatical variants thereof do not preclude the addition of one or more components, features, steps, or integers or groups thereof and that the terms are to be construed as specifying components, features, steps or integers.
If the specification or claims refer to “an additional” element, that does not preclude there being more than one of the additional element.
It is to be understood that where the claims or specification refer to “a” or “an” element, such reference is not to be construed that there is only one of that element.
It is to be understood that where the specification states that a component, feature, structure, or characteristic “may”, “might”, “can” or “could” be included, that particular component, feature, structure, or characteristic is not required to be included.
Where applicable, although state diagrams, flow diagrams or both may be used to describe embodiments, the invention is not limited to those diagrams or to the corresponding descriptions. For example, flow need not move through each illustrated box or state, or in exactly the same order as illustrated and described.
Methods of the instant disclosure may be implemented by performing or completing manually, automatically, or a combination thereof, selected steps or tasks.
The term “method” may refer to manners, means, techniques and procedures for accomplishing a given task including, but not limited to, those manners, means, techniques and procedures either known to, or readily developed from known manners, means, techniques and procedures by practitioners of the art to which the invention belongs.
It should be noted that where reference is made herein to a method comprising two or more defined steps, the defined steps can be carried out in any order or simultaneously (except where context excludes that possibility), and the method can also include one or more other steps which are carried out before any of the defined steps, between two of the defined steps, or after all of the defined steps (except where context excludes that possibility).
Still further, additional aspects of the instant invention may be found in one or more appendices attached hereto and/or filed herewith, the disclosures of which are incorporated herein by reference as if fully set out at this point.
Thus, the invention is well adapted to carry out and attain the ends and advantages mentioned above as well as those inherent therein. While the inventive concept has been described and illustrated herein by reference to certain illustrative embodiments in relation to the drawings attached hereto, various changes and further modifications, apart from those shown or suggested herein, may be made therein by those of ordinary skill in the art without departing from the spirit of the inventive concept, the scope of which is to be determined by the following claims.
This application is a continuation of and claims priority to U.S. patent application Ser. No. 14/819,999, filed Aug. 6, 2015, now U.S. Pat. No. 11,027,193, issued Jun. 8, 2021, which claims priority to and is a continuation-in-part of, U.S. patent application Ser. No. 14/321,333, filed on Jul. 1, 2014, which claims the benefit of U.S. Provisional Patent Application No. 61/841,544, filed on Jul. 1, 2013. This application incorporates each of the foregoing applications by reference in its entirety into this document as if fully set out at this point.
Number | Name | Date | Kind |
---|---|---|---|
4137566 | Haas et al. | Jan 1979 | A |
4192510 | Miller | Mar 1980 | A |
4283056 | Miller | Aug 1981 | A |
4673183 | Trahan | Jun 1987 | A |
5092602 | Witler et al. | Mar 1992 | A |
5150895 | Berger | Sep 1992 | A |
5246232 | Eccher et al. | Sep 1993 | A |
5290037 | Witler et al. | Mar 1994 | A |
5303924 | Kluttz | Apr 1994 | A |
5342051 | Rankin et al. | Aug 1994 | A |
5375832 | Witler et al. | Dec 1994 | A |
5398936 | Kluttz et al. | Mar 1995 | A |
5401026 | Eccher et al. | Mar 1995 | A |
5413345 | Nauck | May 1995 | A |
5486002 | Witler et al. | Jan 1996 | A |
5489099 | Rankin et al. | Feb 1996 | A |
5653642 | Bonacorsi | Aug 1997 | A |
5700204 | Teder | Dec 1997 | A |
5743815 | Helderman | Apr 1998 | A |
5820496 | Bergeron | Oct 1998 | A |
5879246 | Gebhardt et al. | Mar 1999 | A |
6012987 | Nation | Jan 2000 | A |
6179720 | Rankin et al. | Jan 2001 | B1 |
6217444 | Kataoka et al. | Apr 2001 | B1 |
6304665 | Cavallaro et al. | Oct 2001 | B1 |
6320173 | Vock et al. | Nov 2001 | B1 |
6322455 | Howey | Nov 2001 | B1 |
6371862 | Reda | Apr 2002 | B1 |
6373508 | Moengen | Apr 2002 | B1 |
6409607 | Libit et al. | Jun 2002 | B1 |
6437559 | Zajac et al. | Aug 2002 | B1 |
6520864 | Wilk | Feb 2003 | B1 |
6547671 | Mihran | Apr 2003 | B1 |
6607123 | Jollifee et al. | Aug 2003 | B1 |
6702292 | Takowsky | Mar 2004 | B2 |
6764412 | Gobush et al. | Jul 2004 | B2 |
6781621 | Gobush et al. | Aug 2004 | B1 |
6898971 | Dilz | May 2005 | B2 |
6905339 | DiMare et al. | Jun 2005 | B2 |
6974391 | Ainsworth et al. | Dec 2005 | B2 |
6998965 | Luciano et al. | Feb 2006 | B1 |
7040998 | Jolliffe et al. | May 2006 | B2 |
7052391 | Luciano | May 2006 | B1 |
7059974 | Jolliffe et al. | Jun 2006 | B1 |
7095312 | Erario et al. | Aug 2006 | B2 |
7143639 | Gobush | Dec 2006 | B2 |
7160196 | Thirkettle et al. | Jan 2007 | B2 |
7214138 | Stivers et al. | May 2007 | B1 |
7223169 | Imaeda et al. | May 2007 | B2 |
7317388 | Kawabe et al. | Jan 2008 | B2 |
7321330 | Sajima | Jan 2008 | B2 |
7337965 | Thirkettle et al. | Mar 2008 | B2 |
7344446 | Wyeth | Mar 2008 | B2 |
7497780 | Kiraly | Mar 2009 | B2 |
7641565 | Kiraly | Jan 2010 | B2 |
7787886 | Markhovsky et al. | Aug 2010 | B2 |
7815516 | Mortimer et al. | Oct 2010 | B1 |
7822424 | Markhovsky et al. | Oct 2010 | B2 |
7837572 | Bissonnette et al. | Nov 2010 | B2 |
7843429 | Pryor | Nov 2010 | B2 |
7854669 | Marty | Dec 2010 | B2 |
8018375 | Alexopoulos et al. | Sep 2011 | B1 |
8068095 | Pryor | Nov 2011 | B2 |
8077917 | Forsgren | Dec 2011 | B2 |
8113964 | Lindsay | Feb 2012 | B2 |
8142302 | Balardeta et al. | Mar 2012 | B2 |
8257189 | Koudele et al. | Sep 2012 | B2 |
8328653 | Lock | Dec 2012 | B2 |
8335345 | White et al. | Dec 2012 | B2 |
8400346 | Hubbard et al. | Mar 2013 | B2 |
8409024 | Marty et al. | Apr 2013 | B2 |
20050227792 | McCreary et al. | Oct 2005 | A1 |
20070078018 | Kellogg et al. | Apr 2007 | A1 |
20070293331 | Tuxen | Dec 2007 | A1 |
20080139330 | Tuxen | Jun 2008 | A1 |
20080182685 | Marty et al. | Jul 2008 | A1 |
20080261711 | Tuxen | Oct 2008 | A1 |
20090036237 | Nipper et al. | Feb 2009 | A1 |
20090295624 | Tuxen | Dec 2009 | A1 |
20100137079 | Burke et al. | Jun 2010 | A1 |
20110077093 | Garratt | Mar 2011 | A1 |
20110230986 | Lafortune et al. | Sep 2011 | A1 |
20110286632 | Tuxen et al. | Nov 2011 | A1 |
20120068879 | Tuxen | Mar 2012 | A1 |
20130039538 | Johnson et al. | Feb 2013 | A1 |
20130084930 | Chang et al. | Apr 2013 | A1 |
20130274025 | Luciano et al. | Oct 2013 | A1 |
20160287967 | Baldwin et al. | Oct 2016 | A1 |
Number | Date | Country |
---|---|---|
2839362 | Oct 2012 | CA |
1120964 | Apr 1996 | CN |
1556721 | Dec 2004 | CN |
101078898 | Nov 2011 | KR |
03022369 | Oct 2003 | WO |
2007037705 | Apr 2007 | WO |
2011065804 | Jun 2011 | WO |
2012134208 | Oct 2012 | WO |
Entry |
---|
“Examination Report—94(3) for European Application No. 14819897.1 dated Mar. 10, 2023”. |
“Examination Report—94(3) for European Application No. 14819897.1 dated Dec. 16, 2021”. |
“Examination Report—94(3) for European Application No. 14819897.1 dated Jun. 4, 2020”. |
“Examination Report—94(3) for European Application No. 14819897.1 dated Oct. 8, 2018”. |
“Examination Report in UAE Application No. P1722/2015 dated Mar. 17, 2021”. |
“Examination report No. 1 for Australian Patent Application No. 2014284410 dated Mar. 2, 2019”. |
“Examination report No. 1 for Australian Patent Application No. 2020201531 dated Dec. 2, 2020”. |
“Examination report No. 2 for Australian Patent Application No. 2020201531 dated Nov. 24, 2021”. |
“Examiner's Requisition for Canadian Patent Application No. 2916462 dated Jun. 2, 2021”. |
“Final Office Action for Japanese Patent Application No. 2021-022027 dated Jul. 25, 2022”. |
“First Examination Report for India Patent Application No. 11509/DELNP/2015 dated Sep. 9, 2019”. |
“First Office Action for Chinese Patent Application No. 201480037296.2 dated Feb. 14, 2017”. |
“First Office Action for Chinese Patent Application No. 201811316634.2 dated Jun. 25, 2021”. |
“First Office Action for Japanese Patent Application No. 2016-524324 dated May 28, 2018”. |
“First Office Action for Japanese Patent Application No. 2019-000502 dated Mar. 9, 2020”. |
“First Office Action for Japanese Patent Application No. 2021-022027 dated Feb. 2, 2022”. |
“Office Action for Canadian Patent Application No. 2916462 dated Jul. 21, 2021”. |
“Search and Examination Report in UAE Application No. P1722/2015”. |
“Second Office Action for Chinese Patent Application No. 201480037296.2 dated Jul. 10, 2017”. |
“Second Office Action for Chinese Patent Application No. 201811316634.2 dated Feb. 23, 2022”. |
“Second Office Action for Japanese Patent Application No. 2019-000502 dated Sep. 23, 2020”. |
“Supplementary European Search Report for European Patent Application No. 14819897.1 dated Jan. 20, 2017”. |
“Third Office Action for Chinese Patent Application No. 201480037296.2 dated Dec. 29, 2017”. |
Number | Date | Country | |
---|---|---|---|
20210291040 A1 | Sep 2021 | US |
Number | Date | Country | |
---|---|---|---|
61841544 | Jul 2013 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 14819999 | Aug 2015 | US |
Child | 17338839 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 14321333 | Jul 2014 | US |
Child | 14819999 | US |