This application claims priority of Taiwanese Patent Application No. 105129422, filed on Sep. 10, 2016.
The disclosure relates to a wearable augmented reality device, and more particularly to a wearable augmented reality device for assisting a golfer.
A golfer usually learns the correspondence between clubs and carries from practical experience accumulated over rounds of golf, and often estimates an actual carry based on distance markers placed in a driving range or on an estimate sought from a caddie. However, when such estimates depart from reality, the golfer may be misled into using an unsuitable club for a shot.
Additionally, a golfer who is unfamiliar with a golf course needs more information about the golf course in order to efficiently play a game.
Therefore, an object of the disclosure is to provide a wearable augmented reality device for golf play that can alleviate at least one of the drawbacks of the prior art.
According to the disclosure, the wearable augmented reality device for golf play is to be used for assisting a golfer who plays golf with a ball and a club. The wearable augmented reality device includes a camera, a memory, a processor and a display. The camera is configured to capture an image of the ball and the club which is swung by the golfer for hitting the ball so as to obtain a captured image. The memory is configured to store an image recognition module and an image calculation module. The processor is configured to execute the image recognition module so as to recognize the ball and the club in the captured image, to execute the image calculation module so as to determine whether the ball is hit by the club according to a positional relationship between the ball and the club in the captured image, to calculate flight information of the ball when it is determined that the ball is hit by the club, and to output the flight information. The display is configured to display the flight information received from the processor.
Other features and advantages of the disclosure will become apparent in the following detailed description of the embodiment with reference to the accompanying drawings, of which:
Referring to
Referring to
The battery 10 is configured to supply electrical energy for the embodiment of the wearable augmented reality device 1 of the disclosure.
The camera 11 is configured to capture an image of at least one of the ball or the club 8 so as to obtain a captured image. In this embodiment, the camera 11 is a high-speed camera.
The memory 12 is configured to store a user interface 120, an image recognition module 121, an image calculation module 122, map information 123 that is associated with the golf course 7 (including for instance, location of the green 74), an identification-specification database 124 and a club-carry database 129.
The identification-specification database 124 contains correlations, each among an identification of a particular club, a template of the particular club and a set of club specifications of the particular club.
The club-carry database 129 contains multiple correlations, each between the club specifications of a particular club and a carry of the particular club.
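By way of a non-limiting illustration only, the two databases might be organized as sketched below in Python; the class and field names (ClubSpecs, ClubRecord, CarryRecord, and so on) are assumptions made for clarity and are not prescribed by the disclosure.

```python
# Illustrative sketch of possible in-memory layouts for the
# identification-specification database 124 and the club-carry database 129.
# All names and fields are assumptions, not part of the disclosure.
from dataclasses import dataclass, field

@dataclass
class ClubSpecs:
    club_type: str          # e.g. "7-iron"
    loft_deg: float         # loft angle of the club face
    shaft_length_in: float  # shaft length in inches

@dataclass
class ClubRecord:
    identification: str     # the identification 81 marked on the club
    template_path: str      # path to the stored image template of the club
    specs: ClubSpecs

@dataclass
class CarryRecord:
    specs: ClubSpecs
    carries_m: list = field(default_factory=list)  # carries observed for this golfer

    @property
    def average_carry_m(self) -> float:
        return sum(self.carries_m) / len(self.carries_m) if self.carries_m else 0.0

# identification-specification database 124: keyed by club identification
identification_specification_db = {
    "7I-001": ClubRecord("7I-001", "templates/7_iron.png",
                         ClubSpecs("7-iron", 34.0, 37.0)),
}

# club-carry database 129: keyed by club type and updated per golfer
club_carry_db = {
    "7-iron": CarryRecord(ClubSpecs("7-iron", 34.0, 37.0), [140.0, 145.0]),
}
```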
In this embodiment, the display 14 is a projection display including a micro projector 1411 and a prism 1412. The micro projector 1411 is configured to be controlled by the processor 13 to output an image to the prism 1412 so as to project the image onto the retina of the golfer 9. However, the display 14 is not limited to the above-mentioned implementation. For example, the display 14 may be a transparent display 142 in a variation of this embodiment as shown in
In this embodiment, the user interface 120 is operable via the touch module 17, such as a touch pad, a touch switch, etc., by virtue of touch control so as to control the wearable augmented reality device 1. Moreover, the microphone 192 and the loudspeaker 191 enable the golfer 9 to interact with the wearable augmented reality device 1 by virtue of voice control.
Referring to
Referring to
When the identification 81 on the club 8 is unrecognizable, the processor 13 executes the image recognition module 121 so as to recognize an image feature of the club 8 in the captured image, and executes the image calculation module 122 so as to determine, by comparing the image feature thus recognized with the correlations stored in the identification-specification database 124, the club specifications which correspond to the template in the identification-specification database 124 that the image feature of the club 8 matches, a visual representation 143 of which is to be displayed on the display 14. Therefore, the club specifications of the club 8 can be determined and a visual representation thereof can be displayed by the embodiment of the wearable augmented reality device 1.
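A minimal sketch of such template-based matching is given below. The use of OpenCV's matchTemplate, the grayscale templates and the 0.8 similarity threshold are illustrative assumptions, not features required by the disclosure.

```python
# Sketch of matching the recognized club region against stored templates
# when the identification 81 cannot be read. Assumes grayscale images.
import cv2  # OpenCV, used here purely for illustration

def match_club(club_roi, templates, threshold=0.8):
    """club_roi: grayscale image region containing the club.
    templates: mapping from club identification to a grayscale template image.
    Returns the identification of the best-matching template, or None."""
    best_id, best_score = None, 0.0
    for club_id, template in templates.items():
        # skip templates larger than the region of interest
        if template.shape[0] > club_roi.shape[0] or template.shape[1] > club_roi.shape[1]:
            continue
        score = float(cv2.matchTemplate(club_roi, template, cv2.TM_CCOEFF_NORMED).max())
        if score > best_score:
            best_id, best_score = club_id, score
    # only trust the match if the similarity is high enough
    return best_id if best_score >= threshold else None
```

The returned identification could then be looked up in the identification-specification database 124 to obtain the club specifications whose visual representation 143 is displayed.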
Referring to
First, in step 21, the positioning unit 151, such as a Global Positioning System (GPS) device, determines geographic position information associated with a position of the wearable augmented reality device 1, and the direction finder 152, such as a compass implemented by a magnetometer, determines geographic direction information associated with a direction of the wearable augmented reality device 1, e.g., a direction in which the wearable augmented reality device faces. The geographic position information, the geographic direction information and the map information 123 enable the wearable augmented reality device 1 to obtain orientation (in terms of position and forward-facing direction) of the wearable augmented reality device 1 with respect to the golf course 7. The processor 13 is further configured to output, according to the geographic position information, the geographic direction information and the map information 123, a message which indicates the orientation of the wearable augmented reality device 1 with respect to the golf course 7 and which is to be displayed on the display (not illustrated).
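Assuming, for illustration, that the geographic position information is a latitude/longitude pair from the positioning unit 151, that the geographic direction information is a compass heading in degrees from the direction finder 152, and that the map information 123 supplies the coordinates of features such as the green 74, the orientation message might be composed with standard great-circle formulas as sketched below; the function names are hypothetical.

```python
# Sketch of deriving distance and relative direction to a map feature
# from GPS position, compass heading and map coordinates.
import math

def distance_and_bearing(lat1, lon1, lat2, lon2):
    """Great-circle distance (m) and initial bearing (deg) from point 1 to point 2."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    dist = 2 * r * math.asin(math.sqrt(a))
    y = math.sin(dl) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    bearing = (math.degrees(math.atan2(y, x)) + 360) % 360
    return dist, bearing

def orientation_message(device_lat, device_lon, heading_deg, green_lat, green_lon):
    """Compose a message giving distance to the green and its direction
    relative to the direction in which the device is facing."""
    dist, bearing = distance_and_bearing(device_lat, device_lon, green_lat, green_lon)
    relative = (bearing - heading_deg + 360) % 360
    return f"Green: {dist:.0f} m, {relative:.0f} deg clockwise from the facing direction"
```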
In step 22, the processor 13 is further configured to generate, according to the geographic position information, the geographic direction information and the map information 123, green information 144 associated with the green 74 in the golf course 7 that is ahead of the golfer 9, obstacle information 145 associated with an obstacle 75 in the golf course 7, and a club recommendation 149 of club specifications associated with a club suitable for a current shot. The processor 13 outputs the green information 144, the obstacle information 145 and the club recommendation 149 to the display 14 to be displayed thereon. It is worth noting that the club-carry database 129 stored in the memory 12 is established specifically for an individual golfer. Each time the individual golfer uses the wearable augmented reality device 1 for assistance with a shot, the wearable augmented reality device 1 records the club specifications of the club 8 actually used for the shot and the corresponding carry, and updates the club-carry database 129 accordingly. Therefore, specifically, the processor 13 generates the club recommendation according to the thus-refined club-carry database 129, the geographic position information, the geographic direction information and the map information 123.
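One possible way to generate the club recommendation 149 from such a per-golfer club-carry database 129, and to update the database after each shot, is sketched below; the selection rule (the club whose average recorded carry just covers the remaining distance) is an assumption made for illustration.

```python
# Sketch of club recommendation and per-golfer carry bookkeeping.
# The data layout (club type -> list of observed carries) is an assumption.
def recommend_club(club_carry_db: dict, distance_to_green_m: float):
    """Return the club type whose average carry covers the remaining
    distance by the smallest margin, or None if no club reaches it."""
    candidates = []
    for club_type, carries in club_carry_db.items():
        if not carries:
            continue
        avg = sum(carries) / len(carries)
        if avg >= distance_to_green_m:
            candidates.append((avg, club_type))
    return min(candidates)[1] if candidates else None

def record_shot(club_carry_db: dict, club_type: str, carry_m: float):
    """Accumulate the carry actually achieved, refining later recommendations."""
    club_carry_db.setdefault(club_type, []).append(carry_m)

# Example usage:
# record_shot(club_carry_db, "7-iron", 142.0)
# print(recommend_club(club_carry_db, 138.0))
```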
Referring to
Referring to
In step 24, the camera 11 captures an image of the ball and the club 8 which is swung by the golfer 9 for hitting the ball. Then, the processor 13 executes the image recognition module 121 so as to recognize the ball and the club 8 in the captured image, and executes the image calculation module 122 so as to determine whether the ball is hit by the club 8 according to a positional relationship between the ball and the club 8 in the captured image. When it is determined by the processor 13 that the ball is hit by the club 8, the processor 13 increases the stroke number 1402 by one and outputs the updated stroke number 1402 to the display 14 for displaying the same (see
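A minimal sketch of one way to decide, from the positional relationship between the recognized ball and club in consecutive captured images, that the ball has been hit is given below; the bounding-box representation and the thresholds are assumptions made for illustration and are not the claimed method.

```python
# Sketch of hit detection from recognized bounding boxes of ball and club
# across consecutive frames of the high-speed camera 11.
def boxes_touch(box_a, box_b, margin_px=2):
    """Each box is (x, y, w, h) in image coordinates."""
    ax, ay, aw, ah = box_a
    bx, by, bw, bh = box_b
    return (ax - margin_px < bx + bw and bx - margin_px < ax + aw and
            ay - margin_px < by + bh and by - margin_px < ay + ah)

def ball_was_hit(frames, min_displacement_px=20):
    """frames: sequence of (ball_box, club_box) pairs recognized in consecutive images.
    The ball is deemed hit once club and ball have touched and the ball
    has then moved away from its resting position."""
    contact_seen = False
    rest_x, rest_y = frames[0][0][0], frames[0][0][1]
    for ball_box, club_box in frames:
        if boxes_touch(ball_box, club_box):
            contact_seen = True
        dx, dy = ball_box[0] - rest_x, ball_box[1] - rest_y
        if contact_seen and (dx * dx + dy * dy) ** 0.5 > min_displacement_px:
            return True
    return False
```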
Subsequently, the processor 13 outputs the impact information, the predicted position 1461, the flight information 147 and the total distance 148 to the display 14 for displaying the same. The predicted position 1461 is provided to the golfer 9 for ease of finding the ball. The impact information includes a face angle when the ball is hit by the club 8, a club path, and an attack angle (not shown).
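For illustration, assuming the image calculation module 122 can estimate launch conditions such as ball speed, launch angle and launch direction, a drag-free projectile model could supply a carry value for the flight information 147 and an estimate of the predicted position 1461 as sketched below; the disclosure does not specify the flight model, so this simplification is an assumption.

```python
# Sketch of a simplified carry and landing-point estimate from launch conditions.
# Drag, spin and terrain elevation are ignored in this illustration.
import math

def predict_carry_and_landing(ball_speed_mps, launch_angle_deg, launch_direction_deg,
                              start_x_m=0.0, start_y_m=0.0):
    """Return (carry_m, (x_m, y_m)) of the landing point on flat ground,
    with launch_direction_deg measured clockwise from the y axis (north)."""
    g = 9.81
    v_vert = ball_speed_mps * math.sin(math.radians(launch_angle_deg))
    v_horiz = ball_speed_mps * math.cos(math.radians(launch_angle_deg))
    flight_time = 2 * v_vert / g
    carry = v_horiz * flight_time
    x = start_x_m + carry * math.sin(math.radians(launch_direction_deg))
    y = start_y_m + carry * math.cos(math.radians(launch_direction_deg))
    return carry, (x, y)
```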
As shown in
In step 25, the processor 13 determines whether or not the golfer 9 has arrived at the green 74 based on the geographic position information obtained by the positioning unit 151 and the map information 123 stored in the memory 12 (see
In step 26, the laser scanner 16 outputs a scanning result by scanning a grain of the green 74 and a height variation of the green 74. Furthermore, the processor 13 executes the image recognition module 121 so as to generate a green recognition result associated with the grain and the height variation of the green 74 according to the scanning result, and executes the image calculation module 122 so as to calculate, according to the green recognition result, a putting suggestion which is associated with a putting direction and a putting force of a putting stroke to be used on the green 74 and which is to be displayed on the display 14 (not shown).
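Assuming, for illustration, that the green recognition result can be reduced to an along-line slope, a cross-line slope and a grain factor, a putting suggestion might be computed as sketched below; the correction rules and coefficients are illustrative assumptions only.

```python
# Sketch of turning a simplified green recognition result into a putting suggestion.
def putting_suggestion(distance_to_hole_m, cross_slope, along_slope, grain_factor=1.0):
    """
    cross_slope:  lateral height change per metre (positive = right side higher)
    along_slope:  height change per metre toward the hole (positive = uphill)
    grain_factor: >1 when putting against the grain, <1 when with it
    Returns (aim_offset_m, force_scale) relative to a straight, flat, grainless putt.
    """
    # aim toward the high side of the hole to compensate for the lateral break
    aim_offset_m = -cross_slope * distance_to_hole_m * 0.5
    # hit harder uphill and against the grain, softer downhill and with it
    force_scale = (1.0 + along_slope * 5.0) * grain_factor
    return aim_offset_m, max(force_scale, 0.1)
```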
Referring to
In summary, the wearable augmented reality device 1 of this disclosure assists the golfer 9 during golf play. By cooperation of the camera 11, the image recognition module 121, the image calculation module 122, the display 14 and the laser scanner 16, the wearable augmented reality device 1 is capable of displaying, on the display 14, a visual representation 143 of the club specifications (obtained from the identification 81 or from the image feature of the club 8), the flight information 147 of the ball, and the putting suggestion.
In the description above, for the purposes of explanation, numerous specific details have been set forth in order to provide a thorough understanding of the embodiment. It will be apparent, however, to one skilled in the art, that one or more other embodiments may be practiced without some of these specific details. It should also be appreciated that reference throughout this specification to “one embodiment,” “an embodiment,” an embodiment with an indication of an ordinal number and so forth means that a particular feature, structure, or characteristic may be included in the practice of the disclosure. It should be further appreciated that in the description, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of various inventive aspects.
While the disclosure has been described in connection with what is considered the exemplary embodiment, it is understood that this disclosure is not limited to the disclosed embodiment but is intended to cover various arrangements included within the spirit and scope of the broadest interpretation so as to encompass all such modifications and equivalent arrangements.
| Number | Date | Country | Kind |
|---|---|---|---|
| 105129422 | Sep 2016 | TW | national |