Embodiments of the present invention generally relate to athletic activity heads up display systems and methods of using the same. More particularly, embodiments of the present invention relate to systems and methods for presenting computer-generated sensory input via a heads up display to an individual engaged in or observing an athletic activity, based on data related to the athletic activity.
Athletic activity is important to maintaining a healthy lifestyle and is a source of entertainment for many people. Some individuals prefer to engage in team athletic activities such as, for example, soccer or basketball, while other individuals prefer to engage in individual athletic activities such as, for example, running or skiing. Regardless of whether the activity is a team or individual activity, it is common for individuals to participate in both competitive sessions, such as a soccer match or a running race, and more informal training sessions such as conducting soccer drills or running interval sprints. Others who do not themselves regularly take part in athletic activities may nevertheless enjoy viewing athletic activities as a spectator.
Athletic activity monitoring systems exist that are capable of recording information about an individual's performance during an athletic activity using sensors. Some portable fitness monitoring systems employ sensors attached to the individual's body, while other portable fitness monitoring systems rely on sensors attached to a piece of athletic equipment. Such sensors may be capable of measuring various physical and/or physiological parameters associated with the individual's athletic activity.
Technology has resulted in the development of so-called “heads up displays” (HUDs) for presenting visual information to a user without requiring the user to look away from their usual viewpoint. The origin of the term stems from airplane pilots being able to view information (e.g. flight status or plane orientation information) with their heads still positioned “up” and looking forward, instead of angled down looking at lower instruments. HUDs typically include a projector for projecting information onto a transparent surface, such as a glass plate, that enables the background environment of the user's typical viewpoint to still be seen. Although HUDs were initially developed for military aviation, they can now be found in commercial aircraft, automobiles, and computer gaming applications.
HUDs can be a useful tool for a variety of “augmented reality” applications. The basic idea of augmented reality is to present computer-generated sensory input to a user—superimposed graphics, audio, haptic feedback, or other sensory enhancements—that provides information about the environment and its objects in the context of a real-world environment. For example, fans of American football have become accustomed in recent years to the presence of a superimposed “first down” line on televised American football games.
An individual engaged in an athletic activity—or an interested observer such as a coach or fan—may desire to receive information about the athletic activity, including information about the individual's performance. But with respect to providing this information, existing athletic activity monitoring systems suffer from a number of drawbacks. Many existing systems do not provide the individual or interested observer with information about the athletic activity until after the activity has been completed. Other systems may present the information about the athletic activity during the activity, but in a way that distracts the individual or interested observer from focusing on the ongoing athletic activity itself. And many existing HUDs and augmented reality systems are not portable and are therefore not suitable for monitoring many real world athletic competitive or training sessions. Finally, existing athletic activity monitoring systems often fail to provide the individual or interested observer with quick, accurate, insightful information that would enable them to easily compare past performances, develop strategies for improving future performances, or visualize performances.
What is needed are athletic activity heads up display systems and methods having improved capabilities over existing athletic activity monitoring systems, thus offering individuals engaged in athletic activities and other interested observers better tools to assess these activities. At least some of the embodiments of the present invention satisfy the above needs and provide further related advantages as will be made apparent by the description that follows.
Embodiments of the present invention relate to a method of using an athletic activity heads up display system during an athletic activity, including the steps of a heads up display unit receiving information about a sport ball, and the heads up display unit displaying an image to an individual based on the information, where the image is overlaid on the individual's present field of view of an environment.
Embodiments of the present invention also relate to a method of using an athletic activity heads up display system during an athletic activity, including the steps of a heads up display unit recognizing the presence of a sport ball in the heads up display unit's projected image field, and the heads up display unit displaying an image to an individual overlaid on the individual's present field of view of an environment, where the displayed image indicates an objective for the individual.
Embodiments of the present invention further relate to a method of using an athletic activity heads up display system during an athletic activity, including the steps of a heads up display unit determining a position of a part of the body of an individual wearing the heads up display unit during an athletic movement, and the heads up display unit displaying visual feedback to the individual about the position of the part of the body during the athletic movement.
Additional features of embodiments of the invention will be set forth in the description that follows, and in part will be apparent from the description, or may be learned by practice of the invention. Both the foregoing general description and the following detailed description are exemplary and explanatory and are intended to provide further explanation of the invention as claimed.
The accompanying figures, which are incorporated herein, form part of the specification and illustrate embodiments of the present invention. Together with the description, the figures further serve to explain the principles of and to enable a person skilled in the relevant arts to make and use the invention.
The present invention will now be described in detail with reference to embodiments thereof as illustrated in the accompanying drawings. References to “one embodiment”, “an embodiment”, “an example embodiment”, “some embodiments”, etc., indicate that the embodiment described may include a particular feature, structure, or characteristic, but every embodiment may not necessarily include the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with an embodiment, it is submitted that it is within the knowledge of one skilled in the art to effect such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described.
The term “invention” or “present invention” as used herein is a non-limiting term and is not intended to refer to any single embodiment of the particular invention but encompasses all possible embodiments as described in the application.
Various aspects of the present invention, or any parts or functions thereof, may be implemented using hardware, software, firmware, non-transitory tangible computer readable or computer usable storage media having instructions stored thereon, or a combination thereof, and may be implemented in one or more computer systems or other processing systems.
The present invention generally relates to athletic activity “heads up display” (HUD) systems and methods of using the same during athletic activities. More particularly, embodiments of the present invention relate to systems and methods for presenting computer-generated sensory input via a HUD to an individual engaged in or observing an athletic activity, based on data related to the athletic activity.
An individual engaged in an athletic activity or an interested observer, such as a coach or fan, may desire to receive information about an athletic activity, including information about the individual's performance during the athletic activity. Existing athletic activity monitoring systems often disadvantageously present such information in a way that distracts the individual or interested observer from focusing on the ongoing athletic activity itself.
For example, when an individual engaged in an athletic activity must look away from the field of activity in order to view a visual output (e.g. a visual indicator of the individual's heart rate or the speed of a ball kicked by the individual), their level of attention, awareness, and concentration on the athletic activity is necessarily reduced. In the context of an ongoing competitive session (e.g. a soccer match), this risks a drop in the individual's performance and possibly injury to the distracted individual. Even in the context of a more informal non-competitive training session (e.g. conducting soccer drills), the distraction of having to look away from the individual's usual viewpoint can slow down and therefore lengthen the training session and pull the individual out of their training “flow,” possibly reducing the overall effectiveness of the training session.
Other interested individuals such as coaches or athletic activity spectators may be similarly disadvantaged by having to look away from the field of activity in order to view a visual output related to the athlete's performance.
In contrast, the athletic activity HUD systems and methods described herein aim to provide the individual or interested observer with quick, accurate, insightful information to enable them to easily and intuitively obtain information about the athlete's performance in a non-distracting manner, easily compare past performances, develop strategies for improving future performances, and/or visualize aspects of performances.
The individual 10 depicted in
Athletic activity HUDs 100 according to embodiments of the invention may superimpose graphics, audio, and other sensory enhancements over a real-world environment, possibly in real time. Athletic activity HUDs 100 may incorporate display technology that is capable of projecting one or more images in a way that makes them visible and overlaid on top of the individual's 10 “real world” view of their environment.
The depicted projected image field 400 of
In other embodiments of the present invention, the athletic activity HUD may include a projector for actually projecting information substantially further in front of the individual 10 onto other surfaces that effectively turn these surfaces into display surfaces. For example, the athletic activity HUD may project information onto the ground in front of the individual effectively turning the ground into a display surface.
Regardless of the nature of their projector configuration, athletic activity HUDs 100 according to embodiments of the invention can be used to convey a wide variety of information to the individual 10 engaged in an athletic activity. For example, in some embodiments, an athletic activity HUD 100 may display general information to the individual such as the date, time, temperature, or location. This information may be conveyed, for example, as text visually overlaid in the projected image field 400.
The athletic activity HUD 100 may also display tips or guidance to the individual 10 such as tips or guidance on what to expect in their activity or sport, how they should attempt to perform a particular athletic maneuver, or what their objective should be for the activity or sport. In some embodiments, the tips or guidance may be conveyed, for example, as text visually overlaid in the projected image field 400.
In other embodiments, the tips or guidance may take the form of icons or other visual indicia such as a target icon 402 or an obstacle icon 404.
In some embodiments, an athletic activity HUD 100 may be used to display athletic activity results or coaching based on data obtained during the athletic activity.
In the scenario where the individual 10 had just kicked a soccer ball 20 toward a soccer goal 30, the flight path icon 406 may be generated based on data obtained during the athletic activity, as explained in further detail below. In other words, the particular flight path depicted by the flight path icon 406 may be representative of an actual soccer ball 20 flight path from a soccer ball 20 kicked by the individual 10. Similarly, the feedback icon 408, which displays text indicating that the soccer ball's 20 speed was 45 miles per hour and that the soccer ball 20 was spinning at 120 revolutions per minute, may also be representative of the motion characteristics of an actual soccer ball 20 kicked by the individual 10.
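By way of non-limiting illustration, the following Python sketch shows one way a flight path such as that represented by the flight path icon 406 might be computed from measured kick data. The function name is an illustrative assumption, and the simple no-drag projectile model (which ignores spin and air resistance) is a simplification for illustration rather than a description of the disclosed system.

```python
import math

def flight_path_points(speed_mph, launch_angle_deg, n_points=20):
    """Sample points along a simple no-drag projectile arc.

    speed_mph and launch_angle_deg stand in for values measured by the
    system; drag and spin (Magnus) effects are ignored in this sketch,
    so the arc is a plain parabola.
    """
    g = 9.81                                 # gravitational acceleration, m/s^2
    v = speed_mph * 0.44704                  # mph -> m/s
    theta = math.radians(launch_angle_deg)
    vx, vy = v * math.cos(theta), v * math.sin(theta)
    t_flight = 2 * vy / g                    # time until the ball returns to the ground
    points = []
    for i in range(n_points + 1):
        t = t_flight * i / n_points
        points.append((vx * t, vy * t - 0.5 * g * t * t))  # (downrange m, height m)
    return points

# e.g. the 45 mile-per-hour kick described above, launched at 20 degrees:
arc = flight_path_points(45.0, 20.0)
```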
In other embodiments, the athletic activity HUD 100 may be used to display images like those shown in
The processor 110 may be adapted to implement application programs stored in the memory 114 of the HUD 100. The processor 110 may also be capable of implementing analog or digital signal processing algorithms such as raw data reduction and filtering. For example, processor 110 may be configured to receive raw data from sensors and process such data at the HUD 100. The processor 110 may be operatively connected to the power source 112, the memory 114, and the output module 116.
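By way of illustration only, a minimal raw-data filter of the kind the processor 110 might apply to noisy sensor samples before display could resemble the following Python sketch; the class name and window size are illustrative assumptions, not a prescribed implementation.

```python
from collections import deque

class MovingAverageFilter:
    """Smooth a stream of raw sensor samples with a sliding window."""

    def __init__(self, window=8):
        self.samples = deque(maxlen=window)  # discards the oldest sample automatically

    def update(self, raw_value):
        self.samples.append(raw_value)
        return sum(self.samples) / len(self.samples)

# Smoothing a stream of raw accelerometer magnitudes (m/s^2):
filt = MovingAverageFilter(window=8)
smoothed = [filt.update(x) for x in (9.7, 9.9, 12.4, 9.8, 9.6)]
```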
The power source 112 may be adapted to provide power to the HUD 100. In one embodiment, the power source 112 may be a battery. The power source may be built into the HUD 100 or removable from the HUD 100, and may be rechargeable or non-rechargeable. In one embodiment, the HUD 100 may be repowered by replacing one power source 112 with another power source 112. In another embodiment, the power source 112 may be recharged by a cable attached to a charging source, such as a universal serial bus (“USB”), FireWire, Ethernet, Thunderbolt, or headphone cable, attached to a personal computer. In yet another embodiment, the power source 112 may be recharged by inductive charging, wherein an electromagnetic field is used to transfer energy from an inductive charger to the power source 112 when the two are brought in close proximity, but need not be plugged into one another via a cable. In some embodiments, a docking station may be used to facilitate charging.
The memory 114 of an exemplary HUD 100 may be adapted to store application program instructions and to store athletic activity data. In an embodiment, the memory 114 may store application programs used to implement aspects of the functionality of the athletic activity HUD 100 systems and methods described herein. In one embodiment, the memory 114 may store raw data, recorded data, and/or calculated data. In some embodiments, as explained in further detail below, the memory 114 may act as a data storage buffer. The memory 114 may include both read only memory and random access memory, and may further include memory cards or other removable storage devices.
In some embodiments of the present invention, the memory 114 may store raw data, recorded data, and/or calculated data permanently, while in other embodiments the memory 114 may only store all or some data temporarily, such as in a buffer. In one embodiment of the present invention, the memory 114, and/or a buffer related thereto, may store data in memory locations of predetermined size such that only a certain quantity of data may be saved for a particular application of the present invention.
The output module 116 of an exemplary HUD 100 may be a visual display output module 116 integrally coupled to the HUD 100. The visual display output module 116 may be used to visually display information to the individual 10. In an embodiment, the visual display output module 116 may include, for example, a liquid crystal display (LCD), a light-emitting diode (LED) display, or an organic light-emitting diode (OLED) display. In one embodiment, the output module 116 may be adapted to provide audio (e.g. via a speaker) and/or tactile (e.g. vibration) output in addition to or instead of visual display output.
The projector 130 may be an optical collimator setup that includes a convex lens or concave mirror with a cathode ray tube, light-emitting diode, or liquid crystal display at its focus. This setup may produce an image where the light is “parallel,” i.e. perceived to be at infinity. The processor 110 may provide the interface between the rest of the HUD 100 and the projector 130 such that the processor 110 generates the imagery to be displayed by the projector 130.
The combiner 132 may be an angled flat piece of glass (i.e. a beam splitter) located directly in front of the individual's eyes or field of vision that redirects the projected image from the projector 130 in such a way that the individual can see both the field of view (i.e. the background environment) and the projected infinity image at the same time. The combiner 132 may have a special coating that reflects light projected onto it from the projector 130 while allowing other light to pass through. In some embodiments, the combiner 132 may also have a curved surface to refocus the image from the projector 130.
The screen 134 may be a separate surface, such as a transparent or semi-transparent glass plate, that enables the background environment of the individual's 10 typical viewpoint to still be seen. In other words, the screen 134 may provide the basis for the individual to view the effective projected image field 400 in their field of view. In some embodiments, the combiner 132 and the screen 134 may be a single piece of hardware capable of serving the functions of both the combiner 132 and the screen 134.
The video camera 118 may be any suitable video recording component, such as a digital video camera incorporated into the HUD 100. In some embodiments, the video camera 118 may be configured and oriented such that the video camera 118 may record a “first person” view of what the individual wearing the HUD 100 sees while engaged in their athletic activity. Any video recorded may be played back to the individual 10 or other interested individual in real time or after the recorded event. In some embodiments, the HUD may playback a recorded video that is annotated with other visual images, such as the images discussed throughout as being potentially displayed in the projected image field 400. In one embodiment, first person recorded video may be played back to the individual 10 wearing the HUD 100 such that the first person recorded video takes up the individual's 10 entire field of view. In this sense, the first person recorded video may provide a “virtual reality”-like experience to the individual 10. In another embodiment, the first person recorded video may be played back to the individual 10 wearing the HUD 100 such that the first person recorded video takes up less than the individual's 10 entire field of view. For example, the first person recorded video may be displayed in one corner of the individual's 10 field of view so that the individual 10 may view the video or shift their attention to other real world items in their field of view.
Incorporating a video camera 118 into an embodiment of an athletic activity HUD 100 may provide the athletic activity HUD 100 with additional capabilities. For example, athletic activity HUDs 100 according to embodiments of the invention incorporating a video camera 118 may be used to determine the orientation of the individual's 10 body to the ball 20, determine the orientation of the individual's 10 foot 12 to the ball 20, determine the orientation of the ball 20 to the goal 30 (i.e. determine when a goal is scored based on a recorded video image of the ball 20 entering the goal 30), determine the rotation rate or speed of the ball 20, determine the number of ball 20 touches, or even trigger the taking of a still photo or video clip by the HUD 100 based on a determination of an impact between the foot 12 and the ball 20.
In embodiments of the present invention where a video camera 118 is used to determine the relative orientation of the individual's 10 body or the motion of a ball 20, certain optical motion tracking features may be employed. In some embodiments, video of the movements of a portion of the individual's 10 body and/or of a ball 20 is sampled many times per second. The resulting video data can be mapped onto a three dimensional model so that the model can replicate the motion of the portion of the individual's 10 body or of the ball 20.
In other embodiments, the optical motion tracking features can be used to determine whether or not the recorded video image data contains some specific object, feature, or activity (e.g. a ball 20 or a ball being kicked into a goal 30). This can be accomplished, for example, by identifying a specific object (e.g. a ball 20) in specific situations that may be described in terms of well-defined illumination, background, and position of the object relative to the camera. For example, a ball 20 being kicked into a goal 30 could exhibit certain visual patterns of illumination, background, and position.
Some optical motion tracking features may rely on the correspondence of image features between consecutive frames of video, taking into consideration information such as position, color, shape, and texture. Edge detection can be performed by comparing the color and/or contrast of adjacent pixels, looking specifically for discontinuities or rapid changes.
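A minimal sketch of this adjacent-pixel discontinuity test, assuming a grayscale frame represented as a two-dimensional list of 0-255 intensity values (an illustrative representation, not a required one), might look as follows:

```python
def detect_edges(gray, threshold=30):
    """Flag pixels whose intensity jumps sharply relative to the pixel
    to their right or below them, i.e. the adjacent-pixel
    discontinuity test described above.

    gray: 2-D list (rows x cols) of 0-255 intensity values.
    Returns a same-sized 2-D list of 0/1 edge flags.
    """
    rows, cols = len(gray), len(gray[0])
    edges = [[0] * cols for _ in range(rows)]
    for r in range(rows - 1):
        for c in range(cols - 1):
            dx = abs(gray[r][c] - gray[r][c + 1])   # horizontal contrast
            dy = abs(gray[r][c] - gray[r + 1][c])   # vertical contrast
            if max(dx, dy) > threshold:
                edges[r][c] = 1
    return edges
```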
In some embodiments, certain soccer techniques, drills, or types of contacts with a ball 20 may be distinguished from one another based on optical motion tracking data derived from a video camera 118. Dribbling involves the individual 10 running with the ball 20 near their feet under close control. Passing involves the individual 10 kicking (or otherwise permissibly contacting) the ball toward one of their teammates. Juggling is a drill that involves the individual 10 kicking (or otherwise permissibly contacting) the ball 20 repeatedly in the air without letting the ball 20 hit the ground between kicks. In one embodiment of the present invention, recorded video data from a video camera 118 may be used, for example, to identify whether a particular contact of a ball 20 involved, for example, a dribble, pass, or juggle based on the recorded video data. As explained in further detail below, while ball 20 impact data may be obtained using a smart sport ball 300, in some embodiments, the smart sport ball 300 may not be capable of determining the specific types of contacts with the smart sport ball 300 without additional data, such as video data.
The microphone 120 may be any suitable audio recording component. The microphone 120 may be used to receive voice commands from the individual 10 to control the HUD 100, to receive voice notes, to conduct telephone calls or other communications, or to capture other relevant sounds during the course of an athletic activity such as the sound of the athlete's 10 foot 12 contacting a soccer ball 20 during a kick, the sound of a coach calling out an instruction to the individual 10, or the sound of a referee's whistle blowing.
Incorporating a microphone 120 into an embodiment of an athletic activity HUD 100 may provide the athletic activity HUD 100 with additional capabilities. For example, athletic activity HUDs 100 according to embodiments of the invention incorporating a microphone 120 may be used to determine the number of ball 20 touches or to determine how “clean” a kick was based on the sound profile.
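For example, counting candidate ball 20 touches from the microphone 120 signal might reduce to detecting sharp audio peaks, as in the following illustrative Python sketch; the threshold and refractory window are placeholder values for illustration, not tuned parameters of the present invention.

```python
def count_ball_touches(samples, rate_hz=8000, threshold=0.6, refractory_s=0.25):
    """Count sharp audio peaks (candidate ball contacts) in a mono
    audio buffer normalized to [-1, 1]. The refractory window keeps a
    single impact from being counted more than once."""
    touches = 0
    last_touch = -refractory_s
    for i, s in enumerate(samples):
        t = i / rate_hz
        if abs(s) >= threshold and (t - last_touch) >= refractory_s:
            touches += 1
            last_touch = t
    return touches
```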
In some embodiments, the HUD 100 may incorporate one or more other additional sensors 122. For example, in some embodiments, the HUD may include one or more of an acceleration sensor, a heart rate sensor, a magnetic field sensor, a thermometer, an angular momentum sensor (e.g., a gyroscope), and a positioning system receiver. In other embodiments, one or more of these sensors 122 may be omitted, or one or more additional sensors may be added.
The acceleration sensor may be adapted to measure the acceleration of the HUD 100. Accordingly, when the HUD 100 is physically coupled to the individual 10, the acceleration sensor may be capable of measuring the acceleration of the individual's 10 body or portion of the individual's 10 body that the HUD 100 is coupled to, including the acceleration due to the Earth's gravitational field. In one embodiment, the acceleration sensor may include a tri-axial accelerometer that is capable of measuring acceleration in three orthogonal directions. In other embodiments one, two, three, or more separate accelerometers may be used.
A magnetic field sensor may be adapted to measure the strength and direction of magnetic fields in the vicinity of the HUD 100. Accordingly, when the HUD 100 is physically coupled to the individual 10, the magnetic field sensor may be capable of measuring the strength and direction of magnetic fields in the vicinity of the individual 10, including the Earth's magnetic field. In one embodiment, the magnetic field sensor may be a vector magnetometer. In other embodiments, the magnetic field sensor may be a tri-axial magnetometer that is capable of measuring the magnitude and direction of a resultant magnetic vector for the total local magnetic field in three dimensions. In other embodiments one, two, three, or more separate magnetometers may be used.
In one embodiment, the acceleration sensor and the magnetic field sensor may be contained within a single accelerometer-magnetometer module. In other embodiments, the HUD 100 may include only one of the acceleration sensor and the magnetic field sensor, and may omit the other if desired.
Exemplary magnetic field sensor systems having certain athletic performance monitoring features that may be advantageously used in concert with the athletic activity HUD 100 systems and methods of the present invention, are disclosed in commonly owned U.S. patent application Ser. No. 13/797,361, filed Mar. 12, 2013, the entirety of which is incorporated herein by reference thereto.
In addition to the acceleration sensor and the magnetic field sensor, other sensors that may be part of the HUD 100 or separate from but in communication with the HUD 100 may include sensors capable of measuring a variety of athletic performance parameters. The term “performance parameters” may include physical parameters and/or physiological parameters associated with the individual's 10 athletic activity. Physical parameters measured may include, but are not limited to, time, distance, speed, pace, pedal count, wheel rotation count, rotation generally, stride count, stride length, airtime, stride rate, altitude, strain, impact force, jump force, force generally, and jump height. Physiological parameters measured may include, but are not limited to, heart rate, respiration rate, blood oxygen level, blood lactate level, blood flow, hydration level, calories burned, or body temperature.
An angular momentum sensor, which may be, for example, a gyroscope, may be adapted to measure the angular momentum or orientation of the HUD 100. Accordingly, when the HUD 100 is physically coupled to the individual 10, the angular momentum sensor may be capable of measuring the angular momentum or orientation of the individual 10 or portion of the individual's 10 body that the HUD 100 is coupled to. In one embodiment, the angular momentum sensor may be a tri-axial gyroscope that is capable of measuring angular rotation about three orthogonal axes. In other embodiments one, two, three, or more separate gyroscopes may be used. In an embodiment, the angular momentum sensor may be used to calibrate measurements made by one or more of an acceleration sensor and a magnetic field sensor.
A heart rate sensor may be adapted to measure an individual's 10 heart rate. The heart rate sensor may be placed in contact with the individual's 10 skin, such as the skin of the individual's 10 chest, wrist, or head. The heart rate sensor may be capable of reading the electrical activity of the individual's 10 heart.
A temperature sensor may be, for example, a thermometer, a thermistor, or a thermocouple that measures changes in temperature. In some embodiments, the temperature sensor may primarily be used for calibrating other sensors of the athletic activity HUD 100 system, such as, for example, the acceleration sensor and the magnetic field sensor.
In one embodiment, the athletic activity HUD 100 may include a positioning system receiver. The positioning system receiver may be an electronic satellite position receiver that is capable of determining its location (i.e., longitude, latitude, and altitude) using time signals transmitted along a line-of-sight by radio from satellite position system satellites. Known satellite position systems include the GPS system, the Galileo system, the BeiDou system, and the GLONASS system. In another embodiment, the positioning system receiver may be an antenna that is capable of communicating with local or remote base stations or radio transmission transceivers such that the location of the HUD 100 may be determined using radio signal triangulation or other similar principles.
In yet another embodiment, the positioning system may operate based upon magnetometer readings, such as the detection of local magnetic fields, as described in commonly owned U.S. patent application Ser. No. 13/797,361, filed Mar. 12, 2013. In one embodiment, an acceleration sensor can be used in conjunction with a magnetometer and gyroscope in order to calibrate motion and position determinations. For example, information indicative of change in motion, gravity, and change in direction can be obtained using the acceleration sensor. Angular movement can be obtained using the gyroscope, and the absolute “North” orientation or local magnetic field data, such as magnetic field intensity and/or direction, can be obtained using the magnetometer. These sensor readings can be used to determine, for example, the posture of an individual 10, gravity, position and orientation of individual 10 in space, and heading of individual 10.
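A minimal sketch of one such fused computation, a tilt-compensated heading derived from accelerometer and magnetometer readings, appears below. The axis and sign conventions assume a typical sensor frame and are assumptions of this sketch; they would need to be adapted to the actual hardware.

```python
import math

def tilt_compensated_heading(accel, mag):
    """Estimate heading (degrees from magnetic North) from tri-axial
    accelerometer and magnetometer readings, the accelerometer
    supplying the gravity reference used to level the magnetometer.

    accel, mag: (x, y, z) tuples in the sensor frame.
    """
    ax, ay, az = accel
    mx, my, mz = mag
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    # De-rotate the magnetic vector back into the horizontal plane.
    mxh = mx * math.cos(pitch) + mz * math.sin(pitch)
    myh = (mx * math.sin(roll) * math.sin(pitch)
           + my * math.cos(roll)
           - mz * math.sin(roll) * math.cos(pitch))
    return math.degrees(math.atan2(-myh, mxh)) % 360.0
```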
In some embodiments, positioning system receiver data may allow the HUD 100 to detect information that may be used to measure and/or calculate position waypoints, time, location, distance traveled, speed, pace, or altitude.
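For instance, distance and average speed over a recorded track might be derived from successive position waypoints with a great-circle calculation, as in the following illustrative sketch (the waypoint format is an assumption made for illustration):

```python
import math

def haversine_m(p1, p2):
    """Great-circle distance in meters between two (lat, lon) points."""
    r = 6371000.0  # mean Earth radius, m
    lat1, lon1, lat2, lon2 = map(math.radians, (*p1, *p2))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def speed_from_waypoints(waypoints):
    """waypoints: list of (lat, lon, unix_time) samples.
    Returns average speed in m/s over the track."""
    dist = sum(haversine_m(waypoints[i][:2], waypoints[i + 1][:2])
               for i in range(len(waypoints) - 1))
    elapsed = waypoints[-1][2] - waypoints[0][2]
    return dist / elapsed if elapsed > 0 else 0.0
```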
Incorporating one or more of the above recited sensors into an embodiment of an athletic activity HUD 100 may provide the athletic activity HUD 100 with additional capabilities. For example, athletic activity HUDs 100 according to embodiments of the invention incorporating one or more of an acceleration sensor, a magnetic field sensor, an angular momentum sensor, and a positioning system receiver may be used to determine parameters such as the individual's step count, approach speed, body alignment, and form, as explained in further detail below. In other embodiments, the HUD 100 may determine which player out of a group of players kicked a ball 20 at a given time by pattern matching against patterns generated from particular individuals' 10 sensor data profiles.
In other embodiments, the athletic activity HUD 100 may leverage the capabilities of a microphone 120 and an output module 116 taking the form of a speaker to allow for audio communication between multiple individuals 10 using athletic activity HUDs 100, or between an individual using an athletic activity HUD 100 and an individual using another communication device such as a telephone or a two way radio transceiver. In one embodiment, the athletic activity HUD 100 includes hardware suitable for communicating via a cellular communications network.
While the above description has focused on the various exemplary components and functionalities of an athletic activity HUD 100 in isolation, in other embodiments, one or more of the above recited components may instead or additionally be present in other devices that may be connected to the HUD 100 via a network to operate in concert with the HUD 100 to provide improved capabilities and functionalities for the individual 10 using the HUD 100.
Along these lines,
In some embodiments of the present invention, the HUD 100 may communicate with one or more other components illustrated in
Wired communication between the HUD 100 and a personal computer (or other computing device) may be achieved, for example, by placing the HUD 100 in a docking unit that is attached to the personal computer using a communications wire plugged into a communications port of the personal computer. In another embodiment, wired communication between the HUD 100 and the personal computer may be achieved, for example, by connecting a cable between the HUD 100 and the computer. The cable connecting the HUD 100 and the computer may be a USB cable with suitable USB plugs including, but not limited to, USB-A or USB-B regular, mini, or micro plugs, or other suitable cable such as, for example, a FireWire, Ethernet, or Thunderbolt cable.
Wired connection to a personal computer or other computing device may be useful, for example, to upload athletic activity information from the HUD 100 to the personal computer or other computing device, or to download application software updates or settings from the personal computer or other computing device to the HUD 100.
Wireless communication between HUD 100 and the personal computer or other computing device may be achieved, for example, by way of a wireless wide area network (such as, for example, the Internet), a wireless local area network, or a wireless personal area network. As is well known to those skilled in the art, there are a number of known standard and proprietary protocols that are suitable for implementing wireless area networks (e.g., TCP/IP, IEEE 802.16, Bluetooth, Bluetooth low energy, ANT, ANT+ by Dynastream Innovations, or BlueRobin). Accordingly, embodiments of the present invention are not limited to using any particular protocol to communicate between the HUD 100 and the various elements depicted in
In one embodiment, the HUD 100 may communicate with a wireless wide area network communications system such as that employed by mobile telephones. For example, a wireless wide area network communication system may include a plurality of geographically distributed communication towers and base station systems. Communication towers may include one or more antennae supporting long-range two-way radio frequency communication wireless devices, such as HUD 100. The radio frequency communication between antennae and the HUD 100 may utilize radio frequency signals conforming to any known or future developed wireless protocol, for example, CDMA, GSM, EDGE, 3G, 4G, IEEE 802.x (e.g., IEEE 802.16 (WiMAX)), etc. The information transmitted over-the-air by the base station systems and the cellular communication towers to the HUD 100 may be further transmitted to or received from one or more additional circuit-switched or packet-switched communication networks, including, for example, the Internet.
As shown in
The network 200 may be employed for communication between any two or more of the HUD 100, the server 202, a mobile phone 204, a body-mounted device 206, a ball-mounted device 208, and/or a coach device 210, or other electronic components such as, for example, another HUD 100. A variety of information may be communicated between any of these elements, including, for example, athletic performance information, device settings (including HUD 100 settings), software, and firmware.
Communication among these various elements may occur after the athletic activity has been completed or in real-time during the athletic activity.
Exemplary portable electronic device components, including mobile phones 204, having certain athletic performance monitoring features that may be advantageously used in concert with the athletic activity HUD 100 systems and methods of the present invention, are disclosed in commonly owned U.S. patent application Ser. No. 12/836,421, filed Jul. 14, 2010 (which published as U.S. Patent App. Pub. No. 2012/0015779), and commonly owned U.S. patent application Ser. No. 13/080,322, filed Apr. 5, 2011 (which published as U.S. Patent App. Pub. No. 2012/0258433) the entireties of which are incorporated herein by reference thereto.
Exemplary body-mounted devices 206 and/or ball-mounted devices 208, having certain athletic performance monitoring features that may be advantageously used in concert with the athletic activity HUD 100 systems and methods of the present invention, are disclosed in commonly owned U.S. patent application Ser. No. 13/446,937, filed Apr. 13, 2012 (which published as U.S. Patent App. Pub. No. 2013/0274635), commonly owned U.S. patent application Ser. No. 13/446,982, filed Apr. 13, 2012 (which published as U.S. Patent App. Pub. No. 2013/0274040), commonly owned U.S. patent application Ser. No. 13/797,274, filed Mar. 12, 2013 (which published as U.S. Patent App. Pub. No. 2013/0274587), and commonly owned U.S. patent application Ser. No. 14/120,272, filed May 14, 2014, the entireties of which are incorporated herein by reference thereto.
Exemplary coach devices 210 having certain athletic performance monitoring features that may be advantageously used in concert with the athletic activity HUD 100 systems and methods of the present invention are disclosed in commonly owned U.S. patent application Ser. No. 13/543,428, filed Jul. 6, 2012 (which published as U.S. Patent App. Pub. No. 2013-0041590), the entirety of which is incorporated herein by reference thereto.
More detailed examples of embodiments of the present invention that may utilize an athletic activity HUD 100—alone or in combination with one or more of a mobile phone 204, a body-mounted device 206, a ball-mounted device 208, and/or a coach device 210—to provide athletic activity monitoring to individuals 10 engaged in athletic activities or other interested persons are provided below.
In some embodiments, in addition to being displayed to the individual 10 in an overlaid fashion in the projected image field 400 over the real world background of a soccer ball 20 and soccer goal 30 that the individual is looking at while wearing their HUD 100, the information displayed may also be recorded and/or transmitted to other devices such as a server 202, a mobile phone 204, a body-mounted device 206, a ball-mounted device 208, and/or a coach device 210 for storage, further processing, or display.
In embodiments where the individual is utilizing both a HUD 100 and a mobile phone 204 or similar device such as a smart watch, the information may be stored, processed, or displayed by the phone 204 or similar device.
When an individual 10 kicks a soccer ball 20, the impact typically results in deformation of both the individual's 10 foot 12 (or rather, the footwear covering their foot 12) and of the soccer ball 20. Deformation of the soccer ball 20 is more significant and visually apparent—a surface of the soccer ball 20 typically becomes indented or flattened in response to a sufficiently swift kick. The “strike zone” represented by the strike zone icon 414 shown in
In the embodiment of
In embodiments where the individual is utilizing both a HUD 100 and a mobile phone 204 or similar device such as a smart watch, this information may be stored, processed, or displayed by the phone 204 or similar device.
In addition, in embodiments where the individual is utilizing a HUD 100 that incorporates one or more of the above recited sensors, such as an acceleration sensor, a magnetic field sensor, an angular momentum sensor, or a positioning system receiver, the HUD 100 may be used to determine parameters such as the individual's step count, approach speed, body alignment, and form.
Similarly, the individual's 10 approach speed, such as when running towards a ball 20 to kick the ball 20, may be determined by sensing motion of the HUD 100. In embodiments where the individual 10 is also carrying a mobile phone 204, speed may be determined by the HUD 100, the mobile phone 204 carried by the individual 10, or both.
In one embodiment, a HUD 100 projected image field 400 may include footstep icons 418 or other visual indicia that indicate to the individual 10 where they should step and with which foot (left or right) to perform specific soccer movements or drills. For example, a projected image field 400 could include a real world view of a soccer ball 20 and a soccer goal 30 laid out across a soccer field, and further include footstep icons 418 overlaid on the real world soccer field. In an embodiment, the HUD 100 projected image field 400 may include three footstep icons 418 leading toward the real world view of a soccer ball 20 indicating to the individual 10 that they should step with their left foot, their right foot, and their left foot once more before striking the ball 20 with their right foot in order to execute a desired maneuver (e.g. kick the ball 20 toward the goal 30, pass the ball, etc.). In some embodiments, another footstep icon 418 may appear on or near the ball 20 indicating where the individual 10 should kick the ball 20, while in other embodiments a different icon or other visual indicia may serve this purpose.
In another embodiment of the present invention, footstep icons 418 may be presented in a manner that conveys to the individual 10 a desired speed, timing, or cadence with which they should approach the ball. For example, the footstep icons 418 may be displayed one after another in series at specific times, such as one footstep every half second, or the second footstep half a second after the first and the third footstep a quarter of a second after the second. In this way, the footstep icons 418 may be displayed so as to provide an animation of where and when the individual's steps approaching the ball 20 for a soccer maneuver should occur.
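A minimal sketch of such timed presentation follows. Here show_icon stands in for whatever HUD 100 drawing call actually places an icon in the projected image field 400 and is purely hypothetical, as is the example cadence.

```python
import time

# (icon label, seconds to wait before this icon appears)
FOOTSTEP_CADENCE = [("left", 0.0), ("right", 0.5), ("left", 0.25)]

def animate_footsteps(show_icon, cadence=FOOTSTEP_CADENCE):
    """Reveal footstep icons one after another at the desired cadence."""
    for label, delay_s in cadence:
        time.sleep(delay_s)   # pace the animation
        show_icon(label)

# animate_footsteps(lambda label: print("show", label, "footstep"))
```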
In some embodiments, the individual may be given specific drills to improve their skills. For example, the individual might be given coaching to bounce a ball 20 off a wall with the same cadence as that presented in the projected image field 400, conveyed via a speaker, or conveyed through a vibration of the HUD 100. As explained in detail previously, in one embodiment of the present invention, recorded video data from a video camera 118 may be used, for example, to identify whether a particular contact of a ball 20 involved a dribble, pass, or juggle. Likewise, recorded video data can be used to determine if an individual 10 conducting drills has made good contact with the ball 20, or if the ball merely dropped or hit the ground.
Another popular and effective soccer drill is known as “wall ball.” In one embodiment, this soccer skill building game can be enhanced using the HUD 100 according to embodiments of the present invention. For example, the individual 10 wearing the HUD 100 may be given a target on the wall in the projected image field 400 to strike with their real world ball 20. The software of the HUD 100 may be adapted to constantly vary the target area to force the individual 10 to react to the ball 20 as it strikes the wall at different angles. Such a system could build in a timing or cadence aspect to the game to help improve reaction time and the moderation of strike force.
In another embodiment, the HUD 100 could provide via its projected image field 400 the effect of a “virtual target” beyond a real world wall in front of the individual 10. For example, an individual 10 stationed in front of a wall may be able to kick a ball 20 off a wall while having the HUD 100 display in its projected image field 400 the virtual flight path the ball 20 would have taken had the individual 10 not been kicking directly into a wall. In this sense, the individual could achieve an experience somewhat similar to what is experienced by golfers hitting golf balls into projection screens on virtual courses, but without the constraints included in such systems. In this way, individuals 10 could practice making long outlet passes with moving targets (i.e. visually displayed via projected image field 400) without the hassle of being forced to retrieve the ball 20 from a great distance.
Providing the individual 10 with guidance on a desired footstep location and/or timing may enable the individual 10 to better perform specific kicks or other maneuvers, become accustomed to a proper “pace” of conducting specific maneuvers, or improve the individual's 10 form for such maneuvers.
In one embodiment of the present invention, the individual's 10 HUD 100 projected image field 400 may include other types of guidance as to where and how they should move. For example, the individual 10 could be presented with a first person representation of desired motions (e.g. leg, foot, arm, and/or hand motions) that they should try to match to achieve a certain objective. In an embodiment, the individual 10 could be presented with a first person representation of how their feet may have to move to dribble a ball 20 in a certain manner. In some embodiments, the individual's 10 HUD 100 projected image field 400 may include virtual obstacles such as defenders that the individual would have to dribble around. The individual 10 could be visually coached to dribble around or juke past the virtual defender. In other embodiments, the coaching provided in the projected image field 400 may instead illustrate how the individual's 10 feet may have to move to dribble a ball 20 in a certain manner by illustrating an exemplary player conducting the motions (i.e. not a first person representation).
Finally, an individual's body alignment or form may be determined by sensing the orientation of one or more body parts of the individual based on acceleration sensor, magnetic field sensor, and/or angular momentum sensor data from the HUD 100. As illustrated in
The features of embodiments of a HUD 100 system described above where the HUD 100 incorporates one or more of the above recited sensors, such as an acceleration sensor, a magnetic field sensor, an angular momentum sensor, or a positioning system receiver, may additionally or alternatively be provided in a system including a HUD 100 as well as a body-mounted device 206 that includes one or more of the above recited sensors. In such a system, the body-mounted device 206 may be used to determine parameters such as the individual's step count, approach speed, body alignment, and form.
Specifically, in embodiments where the individual 10 is also wearing a body-mounted device 206, such as a chest body-mounted device 206, step count may be determined by sensing vibrations at the HUD 100, the body-mounted device 206 carried by the individual 10, or both.
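By way of illustration, a simple step counter over accelerometer magnitude samples might detect upward threshold crossings, as sketched below; a production pedometer would add filtering and adaptive thresholds, and the values shown are placeholders.

```python
def count_steps(accel_magnitudes, threshold=11.0):
    """Count footfalls as upward crossings of an acceleration-magnitude
    threshold (m/s^2, gravity included). Counting only the rising edge
    keeps one footfall from registering as several samples."""
    steps = 0
    above = False
    for a in accel_magnitudes:
        if a >= threshold and not above:
            steps += 1
        above = a >= threshold
    return steps
```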
Similarly, the individual's 10 approach speed, such as when running towards a ball 20 to kick the ball 20, may be determined by sensing motion of the HUD 100, the body-mounted device 206 carried by the individual 10, or both.
Finally, an individual's body alignment or form may be determined by sensing the orientation of one or more body parts of the individual based on acceleration sensor, magnetic field sensor, and/or angular momentum sensor data from the HUD 100. And in embodiments where the individual 10 is also wearing a body-mounted device 206, such as a chest body-mounted device 206, one or more of these vectors illustrated in
For example, a HUD mounted to the individual's 10 head and a chest body-mounted device 206 may provide discrete data sets for determining different orientation vectors, how they relate to one another, and how this signifies the individual's 10 body alignment. This information can be used to provide feedback or other coaching to the individual on how their alignment impacts their athletic performance, including the motion metrics of their kicks.
The processor 310 may be adapted to implement application programs stored in the memory 314 of the smart sport ball 300. The processor 310 may also be capable of implementing analog or digital signal processing algorithms such as raw data reduction and filtering. For example, processor 310 may be configured to receive raw data from sensors and process such data at the smart sport ball 300. The processor 310 may be operatively connected to the power source 312, the memory 314, the transceiver 318, and the acceleration sensor 316.
The power source 312 may be adapted to provide power to the smart sport ball 300. In one embodiment, the power source 312 may be a battery. The power source may be built into the smart sport ball 300 or removable from the smart sport ball 300, and may be rechargeable or non-rechargeable. In one embodiment, the smart sport ball 300 may be repowered by replacing one power source 312 with another power source 312. In another embodiment, the power source 312 may be recharged by a cable attached to a charging source, such as a universal serial bus (“USB”), FireWire, Ethernet, Thunderbolt, or headphone cable, attached to a personal computer. In yet another embodiment, the power source 312 may be recharged by inductive charging, wherein an electromagnetic field is used to transfer energy from an inductive charger to the power source 312 when the two are brought in close proximity, but need not be plugged into one another via a cable. In some embodiments, a docking station may be used to facilitate charging.
The memory 314 of an exemplary smart sport ball 300 may be adapted to store application program instructions and to store athletic activity data, such as motion data. In an embodiment, the memory 314 may store application programs used to implement aspects of the functionality of the systems and methods described herein. In one embodiment, the memory 314 may store raw data, recorded data, and/or calculated data. In some embodiments, as explained in further detail below, the memory 314 may act as a data storage buffer. The memory 314 may include both read only memory and random access memory, and may further include memory cards or other removable storage devices.
In some embodiments of the present invention, the memory 314 may store raw data, recorded data, and/or calculated data permanently, while in other embodiments the memory 314 may only store all or some data temporarily, such as in a buffer. In one embodiment of the present invention, the memory 314, and/or a buffer related thereto, may store data in memory locations of predetermined size such that only a certain quantity of data may be saved for a particular application of the present invention.
The transceiver 318 depicted in
In one embodiment, the transceiver 318 is a low-power transceiver. In some embodiments, the transceiver 318 may be a two-way communication transceiver 318, while in other embodiments the transceiver 318 may be a one-way transmitter or a one-way receiver.
The acceleration sensor 316 may be adapted to measure the acceleration of the smart sport ball 300, including the acceleration due to the Earth's gravitational field. In one embodiment, the acceleration sensor 316 may include a tri-axial accelerometer that is capable of measuring acceleration in three orthogonal directions. In other embodiments one, two, three, or more separate accelerometers may be used.
It is possible to use smart soccer balls 300, such as those described above, to obtain ball speed, ball spin rate, ball spin axis, and ball launch angle data using experimental methods. Based on the data obtained, given a suitably large and representative sample and suitably precise measurement techniques, and assuming the energy transfer between the foot 12 and smart soccer ball 300 depends solely on the inertial and elastic properties of the smart soccer ball 300 (which are constant), it is also possible to undertake a multi-variable regression analysis to link ball speed, ball spin rate, ball spin axis, and ball launch angle data to point of impact location data for a smart soccer ball 300. This process is described in detail in commonly owned U.S. patent application Ser. No. 14/120,272, filed May 14, 2014.
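As a purely illustrative sketch of such a multi-variable regression (with placeholder numbers rather than real experimental measurements), ordinary least squares can link the measured motion characteristics to an impact-location coordinate:

```python
import numpy as np

# Each row: [speed_mps, spin_rpm, spin_axis_deg, launch_angle_deg] measured
# for one experimental kick; y holds the corresponding impact-location
# coordinate on the ball surface (one axis shown). Placeholder data only.
X = np.array([[20.1, 120.0, 10.0, 18.0],
              [25.4,  90.0,  5.0, 22.0],
              [18.7, 300.0, 40.0, 15.0],
              [22.3,  60.0,  2.0, 20.0],
              [27.0, 150.0, 20.0, 25.0]])
y = np.array([0.02, 0.01, 0.08, 0.00, 0.03])

# Ordinary least squares with an intercept term appended as a column of ones.
A = np.hstack([X, np.ones((X.shape[0], 1))])
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict_impact(speed, spin, axis, launch):
    """Predict the impact-location coordinate for a new kick."""
    return float(np.dot(coeffs, [speed, spin, axis, launch, 1.0]))
```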
In sum, as illustrated in
The exemplary projected image field 400 display of
Other visual displays giving feedback about the motion characteristics of the smart soccer ball 300 during their kick may also be provided to the individual 10. In one embodiment, as shown in
Accordingly, the methods previously described for analyzing the relationship between the location of the point of impact between an individual's 10 foot 12 and a smart soccer ball 300 and motion characteristics of the smart soccer ball 300 during and after it is kicked can be used to generate feedback to the individual 10 via the HUD 100, such as visually illustrating the location of a point of impact icon 612 overlaid on top of a ball icon 602, as shown in
In another embodiment, the HUD 100 may be capable of determining the location of the smart soccer ball 300 within the projected image field 400 and may visually display the point of impact icon 612 overlaid on top of the individual's 10 actual view of the smart soccer ball 300 falling within the projected image field 400, as opposed to overlaid on top of a ball icon 602. The location of the smart soccer ball 300 within the projected image field 400 may be determined, for example, using the video camera 118 of the HUD 100 and/or by receiving data from the smart soccer ball 300 indicative of its location.
The exemplary projected image field 400 display of
In one embodiment of the present invention, the visually depicted spin rate of an animated spinning ball icon 602 may be equal to the calculated spin rate of the smart soccer ball 300. For example, if the calculated spin rate of the smart soccer ball 300 is three hundred revolutions per minute, the visually depicted spin rate of an animated spinning ball icon 602 may be three hundred revolutions per minute. In another embodiment of the present invention, the visually depicted spin rate of an animated spinning ball icon 602 may be proportional to, but not equal to, the calculated spin rate of the smart soccer ball 300. For example, if the calculated spin rate of the smart soccer ball 300 is three hundred revolutions per minute, the visually depicted spin rate of an animated spinning ball icon 602 may be half of that spin rate, or one hundred and fifty revolutions per minute. In still other embodiments, the visually depicted spin rate of an animated spinning ball icon 602 may not be correlated to the calculated spin rate of the smart soccer ball 300.
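These three alternatives might be captured in a small mapping function such as the following sketch (the function name, mode labels, and fallback rate are illustrative assumptions):

```python
def displayed_spin_rpm(calculated_rpm, mode="proportional", factor=0.5,
                       fixed_rpm=60.0):
    """Map the ball's calculated spin rate to the spin rate used to
    animate the spinning ball icon, per the three options above."""
    if mode == "equal":          # icon spins at the true rate
        return calculated_rpm
    if mode == "proportional":   # e.g. factor=0.5: 300 rpm shown as 150 rpm
        return calculated_rpm * factor
    return fixed_rpm             # uncorrelated: a constant, readable rate

displayed_spin_rpm(300.0)  # -> 150.0
```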
In another embodiment, the HUD 100 may be capable of determining the location of the smart soccer ball 300 within the projected image field 400 and may visually display the spinning ball icon 602 overlaid on top of the individual's 10 actual view of the smart soccer ball 300 falling within the projected image field 400, as opposed to overlaid on top of a ball icon 602.
In this way, the ball flight path, ball strike, and ball spin visual display and feedback features illustrated in
Embodiments of the present invention employing a HUD 100 in combination with a smart soccer ball 300 may provide additional benefits to the individual 10. For example, the locations of target icons 402, flight path icons 406, strike zone icons 414, point of impact icons 416, and footstep icons 418 that are displayed in the projected image field 400 may be determined, in whole or in part, based on data received from the smart soccer ball 300. In addition, parameters such as the individual's step count, approach speed, body alignment, and form described above as being calculated based on data from the HUD 100, a mobile phone 204, and/or the body-mounted device 206 may further be based, in whole or in part, on data received from the smart soccer ball 300. For example, sensors in the smart soccer ball 300 may be used to sense vibrations associated with the individual taking steps toward the smart soccer ball 300 and eventually kicking the smart soccer ball 300.
Moreover, embodiments of the present invention employing a HUD 100 having a video camera 118 in combination with a smart soccer ball 300 may be able to determine the position of the smart soccer ball 300 relative to the goal 30, i.e., determine when a goal is scored, based on a recorded video image of the smart soccer ball 300 entering the goal 30 and on data from the smart soccer ball 300, such as the time of impact or other relevant data.
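One plausible, simplified fusion of the two data sources described above is to declare a goal only when a video-detected goal-line crossing and a ball-reported event agree in time; the timestamp-list inputs and the 0.5 second agreement window below are assumptions for illustration.

```python
# An illustrative fusion rule: declare a goal only when a goal-line crossing
# detected in video from the video camera 118 agrees in time with an event
# reported by the smart soccer ball 300. The timestamp-list inputs and the
# 0.5 s agreement window are assumptions for illustration.

def goal_scored(video_crossing_times, ball_event_times, window_s=0.5):
    """Return the first time at which both sources agree a goal occurred, else None."""
    for tv in video_crossing_times:
        if any(abs(tv - tb) <= window_s for tb in ball_event_times):
            return tv
    return None

print(goal_scored([12.4, 57.1], [12.6]))  # -> 12.4: both sources agree
```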
Embodiments of the present invention employing a HUD 100 having a microphone 120 in combination with a smart soccer ball 300 may be able to determine the number of smart soccer ball 300 touches, or to determine how "clean" a kick was, based on the sound profile recorded by the microphone 120 and on data from the smart soccer ball 300, such as the time of impact or other relevant data.
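A minimal sketch of this audio-and-ball-data approach is shown below: touches are counted by matching ball-reported impact times to sound onsets, and kick "cleanness" is scored from the duration of the impact sound burst. The 0.1 second matching window and the duration-based cleanness heuristic are illustrative assumptions.

```python
# An illustrative pairing of ball-reported impact times with sound onsets from
# the microphone 120: matched pairs count as touches, and a shorter impact
# sound burst scores as a "cleaner" kick. The matching window and the
# duration-based cleanness heuristic are assumptions for illustration.

def match_touches(impact_times, sound_onsets, window_s=0.1):
    touches = []
    for t in impact_times:
        hits = [s for s in sound_onsets if abs(s - t) <= window_s]
        if hits:
            touches.append((t, min(hits, key=lambda s: abs(s - t))))
    return touches

def cleanness(burst_duration_s, max_clean_s=0.05):
    """1.0 for a very short, crisp impact sound, falling to 0.0 for a long scuff."""
    return max(0.0, 1.0 - burst_duration_s / max_clean_s)

print(len(match_touches([3.1, 7.8], [3.12, 9.0])))  # -> 1 matched touch
print(cleanness(0.02))                              # -> ~0.6
```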
Embodiments of the present invention employing a HUD 100 in combination with a smart soccer ball 300 may further be able to determine which player among a group of players (e.g. members of a team) kicked the smart soccer ball 300, to the extent that the system may pair RFID data (or data from similar identification technologies) or other data associated with each individual 10 with each impact of the smart soccer ball 300.
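By way of example only, such attribution could be implemented by selecting the player identifier whose RFID read is closest in time to each ball impact; the (time, player_id) data model and the 0.3 second window below are assumptions for illustration.

```python
# An illustrative attribution scheme: each impact of the smart soccer ball 300
# is credited to the player whose RFID (or similar) read is closest in time to
# the impact. The (time, player_id) data model and 0.3 s window are assumptions.

def kicker_for_impact(impact_time, rfid_reads, window_s=0.3):
    """Return the player_id read closest in time to the impact, or None."""
    nearby = [(abs(t - impact_time), pid) for t, pid in rfid_reads
              if abs(t - impact_time) <= window_s]
    return min(nearby)[1] if nearby else None

reads = [(10.02, "player7"), (10.25, "player3")]
print(kicker_for_impact(10.05, reads))  # -> "player7"
```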
In one embodiment of the present invention, the coach's 15 HUD 100 may present the coach 15 with informational icons 420 for items of interest falling within the coach's 15 projected image field 400. For example, as illustrated in
In another embodiment of the present invention, the coach's 15 HUD 100 may allow the coach 15 to communicate during an activity with one or more individuals 10, such as players also equipped with HUDs 100. Communication may include audio (e.g. voice) communication and/or visual (e.g. text messages, images, or videos) communication. In one embodiment, the coach's 15 HUD 100 may present the coach 15 with a visual illustration of a desired play, formation, player route, or other visual information about desired player movement during the activity. And in some embodiments, the system could be adapted such that one or more players (e.g. all players, a team captain, or a player currently in possession of the ball 20) can simultaneously view, on their own HUDs 100, the same visual information about desired player movement that the coach 15 is viewing on his HUD 100. In this way, a coach 15 and his players can share information and strategize about desired player movements in real time during an athletic activity without having to huddle together during a stoppage of play (e.g. a timeout).
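A minimal sketch of one way such a shared play illustration might be serialized for distribution to selected players' HUDs 100 is shown below; the JSON message shape and the recipient filter are illustrative assumptions, and any transport available to the HUDs could carry the message.

```python
# An illustrative serialization of a coach's play diagram for distribution to
# selected players' HUDs 100. The JSON message shape and the recipient filter
# are assumptions; any transport available to the HUDs could carry the message.
import json

def build_play_message(play_name, routes, recipients="all"):
    """routes maps player_id to waypoints (x, y) on a shared field grid."""
    return json.dumps({
        "type": "play",
        "name": play_name,
        "routes": routes,
        "recipients": recipients,  # e.g. "all", "captain", or a list of players
    })

print(build_play_message("overlap-left", {"player7": [(10, 30), (25, 40)]}))
```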
In one embodiment of the present invention, individuals 10 each having a HUD 100 and playing over a wireless network could play networked games, similar to how video games have evolved to enable remote play. For example, a virtual penalty kick mode could be enabled in which an individual 10 in one location lines up to take a shot on goal while an individual 10 in another location lines up to block the shot. The first individual 10 would see, in their projected image field 400, a virtual representation of the other individual 10 that responds to that individual's 10 motions. The second individual 10 would see a virtual representation of the first individual 10, as well as a virtual representation of a ball, in their projected image field 400. In this example, both individuals 10 would be able to react to the other's motions, and the second individual 10 would be forced to react to the virtual ball launched at him.
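As a sketch of the kind of state each side might exchange over the wireless network in this virtual penalty kick mode, the structures below capture the kicker's launch parameters and the keeper's position; all field names are assumptions for illustration.

```python
# An illustrative sketch of the state each HUD 100 might exchange in the
# virtual penalty kick mode: the kicker streams launch parameters, the keeper
# streams position, and each side renders the other's avatar from the latest
# state received. All field names are assumptions for illustration.
from dataclasses import dataclass, asdict

@dataclass
class KickerState:
    t: float               # shared session clock, seconds
    kicker_x_m: float      # kicker's lateral position, for avatar rendering
    ball_speed_mps: float  # launch speed once the virtual ball is struck
    ball_dir_deg: float    # horizontal launch angle relative to goal center

@dataclass
class KeeperState:
    t: float
    keeper_x_m: float      # lateral position across the goal mouth
    dive: bool             # whether a dive animation is in progress

print(asdict(KickerState(3.2, 0.4, 24.0, -8.0)))
```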
The system may further be able to determine which player is in control of the ball, or which player most recently kicked the ball, based on recorded video data of the game play and on data from the smart soccer ball 300, such as the time of impact or other relevant data. In one embodiment, the coach's 15 HUD 100 may store data associated with each individual player in association with a profile for that player, for later analysis or sharing with the player.
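A minimal sketch of such per-player profile storage appears below; the in-memory structure stands in for whatever persistence the coach's 15 HUD actually uses, and the event fields are illustrative assumptions.

```python
# An illustrative per-player profile store: the coach's HUD appends each
# attributed event to the corresponding player's profile for later analysis or
# sharing. The in-memory dict stands in for whatever persistence the HUD uses;
# the event fields are assumptions for illustration.
from collections import defaultdict

profiles = defaultdict(list)

def record_event(player_id, event):
    profiles[player_id].append(event)

record_event("player7", {"t": 10.05, "kind": "kick", "speed_mps": 24.0})
record_event("player7", {"t": 57.10, "kind": "goal"})
print(profiles["player7"])
```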
Embodiments have been described above with the aid of functional building blocks illustrating the implementation of specified functions and relationships thereof. The boundaries of these functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternate boundaries can be defined so long as the specified functions and relationships thereof are appropriately performed.
The foregoing description of the specific embodiments of the monitoring system described with reference to the figures will so fully reveal the general nature of the invention that others can, by applying knowledge within the skill of the art, readily modify and/or adapt for various applications such specific embodiments, without undue experimentation, without departing from the general concept of the present invention.
While various embodiments of the present invention have been described above, they have been presented by way of example only, and not limitation. It should be apparent that adaptations and modifications are intended to be within the meaning and range of equivalents of the disclosed embodiments, based on the teaching and guidance presented herein. It therefore will be apparent to one skilled in the art that various changes in form and detail can be made to the embodiments disclosed herein without departing from the spirit and scope of the present invention. The elements of the embodiments presented above are not necessarily mutually exclusive, but may be interchanged to meet various needs as would be appreciated by one of skill in the art.
It is to be understood that the phraseology or terminology used herein is for the purpose of description and not of limitation. The breadth and scope of the present invention should not be limited by any of the above-described exemplary embodiments, but should be defined only in accordance with the following claims and their equivalents.
The claims in the instant application are different from those of the parent application or other related applications. The Applicant therefore rescinds any disclaimer of claim scope made in the parent application or any predecessor application in relation to the instant application. The Examiner is therefore advised that any such previous disclaimer, and the cited references that it was made to avoid, may need to be revisited. Further, the Examiner is also reminded that any disclaimer made in the instant application should not be read into or against the parent application.
The present application is a continuation of U.S. patent application Ser. No. 14/316,425, filed Jun. 26, 2014, which is incorporated herein by reference in its entirety.
Publication Number | Date | Country |
---|---|---|
20170251160 A1 | Aug 2017 | US |

Relation | Application Number | Date | Country |
---|---|---|---|
Parent | 14/316,425 | Jun 2014 | US |
Child | 15/593,954 | | US |