The present disclosure relates generally to the field of object tracking, and more specifically to tracking objects using light detection and ranging (LIDAR).
The background description provided herein is for the purpose of generally presenting the context of the disclosure. The work of the presently named inventors, to the extent it is described in this background section, as well as aspects of the description that may not otherwise qualify as prior art at the time of filing, are neither expressly nor impliedly admitted as prior art against the present disclosure.
Tracking objects moving in three dimensions may have various practical applications. For example, it may be useful to track the flight of balls or objects used in sports such as golf balls, baseballs, footballs, etc., to, among other things, simulate flight paths, predict initial launch conditions, etc. Traditional systems for tracking objects may encounter several limitations, such as motion blur at high speeds, problems reconciling variable lighting conditions, problems based on object and/or material color, and the size and/or shape of the object.
The following presents a simplified summary of the present disclosure in order to provide a basic understanding of some aspects of the disclosure. This summary is not an extensive overview of the disclosure. It is not intended to identify key or critical elements of the disclosure or to delineate the scope of the disclosure. The following summary merely presents some concepts of the disclosure in a simplified form as a prelude to the more detailed description provided below.
In an embodiment, the disclosure describes a system for determining kinematics of at least one object. The system may include a tracking computer including one or more processors and a memory, a camera system in electronic communication with the tracking computer and configured to capture images of the at least one object, and a light detection and ranging (LIDAR) system in electronic communication with the tracking computer and configured to detect the at least one object. The memory may contain processor-executable instructions that, when executed by the one or more processors, may cause the tracking computer to receive image data from the camera system and receive LIDAR data from the LIDAR system, and analyze the image data and the LIDAR data to determine flight characteristics of the at least one object.
In another embodiment, the disclosure describes a system for determining kinematics of at least one object. The system may include a tracking computer including one or more processors and a memory and a light detection and ranging (LIDAR) system in electronic communication with the tracking computer and configured to detect the at least one object. The memory may contain processor-executable instructions that, when executed by the one or more processors, cause the tracking computer to receive LIDAR data from the LIDAR system, and analyze the LIDAR data to determine flight characteristics of the at least one object.
In another embodiment, the disclosure describes a method of tracking a three-dimensional (3D) object. The method may include scanning for the 3D object using one or more LIDAR devices in one or more fields of view and detecting the 3D object using the one or more LIDAR devices. The method may include generating LIDAR data with the one or more LIDAR devices based on the detecting of the 3D object and transmitting the LIDAR data from the one or more LIDAR devices to a tracking computer. The method may include analyzing the LIDAR data to determine flight characteristics of the 3D object based on the LIDAR data.
Non-limiting and non-exhaustive embodiments are described with reference to the following drawings. The components in the figures are not necessarily to scale, emphasis instead being placed upon illustrating the principles of the disclosure. In the drawings, like reference numerals refer to like parts throughout the various figures unless otherwise specified.
For a better understanding of the present disclosure, reference will be made to the following detailed description, which is to be read in association with the accompanying drawings, wherein:
Persons of ordinary skill in the art will appreciate that elements in the figures are illustrated for simplicity and clarity, so not all connections and options are shown, to avoid obscuring the inventive aspects. For example, common but well-understood elements that are useful or necessary in a commercially feasible embodiment are often not depicted in order to facilitate a less obstructed view of these various embodiments of the present disclosure. It will be further appreciated that certain actions and/or steps may be described or depicted in a particular order of occurrence, while those skilled in the art will understand that such specificity with respect to sequence is not actually required. It will also be understood that the terms and expressions used herein are to be defined with respect to their corresponding respective areas of inquiry and study except where specific meanings have otherwise been set forth herein.
The present disclosure now will be described more fully hereinafter with reference to the accompanying drawings, which form a part hereof, and which show, by way of illustration, specific exemplary embodiments by which the disclosure may be practiced. This disclosure may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art. Among other things, the present invention may be embodied as methods or devices. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. The following detailed description is, therefore, not to be taken in a limiting sense.
Throughout the specification and claims, the following terms take the meanings explicitly associated herein, unless the context clearly dictates otherwise. The phrase “in one embodiment” as used herein does not necessarily refer to the same embodiment, although it may. Furthermore, the phrase “in another embodiment” as used herein does not necessarily refer to a different embodiment, although it may. Thus, as described below, various embodiments of the invention may be readily combined, without departing from the scope or spirit of the invention.
In addition, as used herein, the term “or” is an inclusive “or” operator, and is equivalent to the term “and/or,” unless the context clearly dictates otherwise. The term “based on” is not exclusive and allows for being based on additional factors not described, unless the context clearly dictates otherwise. In addition, throughout the specification, the meaning of “a,” “an,” and “the” includes plural references. The meaning of “in” includes “in” and “on.”
In some embodiments, the disclosure describes systems and methods for providing improved tracking and analyzing of three-dimensional (3D) objects, such as golf balls in flight. In some embodiments, the disclosed tracking system may implement light detection and ranging (LIDAR) techniques and technology to determine kinematics and other characteristics of an object in flight, including but not limited to speed, spin, trajectory, launch angle, etc. In some embodiments, the tracking systems described herein may include the use of LIDAR technology alone, or may include a combination of LIDAR technology and vision technology (e.g., high speed cameras, etc.), to track and analyze flight characteristics of 3D objects. In some embodiments, the use of LIDAR technology may improve upon or overcome many of the shortcomings in traditional object tracking, including motion blur particularly at high speed, problems with variable lighting conditions, and problems with material color, size, and/or shape. In some embodiments, the tracking system and methods may be used in one or more practical applications, such as aiding in golf ball or golf club research and the development of new golf balls or golf clubs.
In some embodiments, the tracking system and methods may use LIDAR technology to scan a field of view through which a golf ball or other object may travel. In some embodiments, multiple LIDAR devices may be used that may each scan a field of view on multiple distinct axes, such as two or three axes that may be orthogonal to one another (e.g., x, y, and/or z axes). In some embodiments, a single LIDAR device may be used, or multiple LIDAR devices oriented to capture fields of view that may not be orthogonal to one another, so long as the angle to one another may be known and accounted for. In some embodiments, when the object (e.g., golf ball, baseball) moves through the field of view (FOV), the one or more LIDAR devices may measure the object's position over time. The position-time measurements may then be used to determine characteristics about the object's flight path, such as launch conditions (e.g., speed, angle, acceleration, etc.), direction of a club or bat striking the object (e.g., club face angle), etc. In some embodiments, vision technology (e.g., high speed cameras), may be paired with the LIDAR technology and may provide additional information regarding the object's flight characteristics. For example, a high speed camera may visually track the rotation of the object as it may move across the camera's field of view. In such embodiments, the tracking system may provide relatively high precision information regarding the object's flight path, spin, launch conditions, etc.
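By way of non-limiting illustration, the position-over-time analysis described above may be sketched as follows. The code, variable names, and sample values are illustrative assumptions only (including the assumption that drag is negligible over the first few samples, so early flight is approximately linear), and are not an implementation of the disclosure:

```python
# Illustrative sketch: estimating launch speed and launch angle from
# LIDAR position-time samples. Assumes drag is negligible over the
# first few samples so that early flight is approximately linear.
import math

def linear_slope(ts, vals):
    """Least-squares slope of vals versus ts."""
    n = len(ts)
    mt = sum(ts) / n
    mv = sum(vals) / n
    num = sum((t - mt) * (v - mv) for t, v in zip(ts, vals))
    den = sum((t - mt) ** 2 for t in ts)
    return num / den

def launch_conditions(samples):
    """samples: list of (t, x, y) in seconds/meters, where x is
    down-range distance and y is height above the launch point.
    Returns (speed in m/s, launch angle in degrees)."""
    ts = [s[0] for s in samples]
    vx = linear_slope(ts, [s[1] for s in samples])
    vy = linear_slope(ts, [s[2] for s in samples])
    return math.hypot(vx, vy), math.degrees(math.atan2(vy, vx))

# Example: a ball sampled every 2 ms shortly after launch
samples = [(i * 0.002, 58.2 * i * 0.002, 14.5 * i * 0.002) for i in range(5)]
speed, angle = launch_conditions(samples)
```

In practice, more samples and a drag-aware flight model would improve the estimate; this sketch only shows how position-time pairs yield launch conditions.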
In some embodiments, the LIDAR devices 104 and the cameras 118 may be disposed up range 112 from a launch point 108 of the object 102, and the LIDAR devices and camera may be configured to track the object's flight characteristics down range 109 of the launch point 108. The object 102 may have a launch position 102a at the launch point 108, and may have multiple flight positions 102b down range of the launch point. In some embodiments, a user 110 may launch the object 102 down range 109 from the launch point 108, such as by striking the object with a golf club, baseball bat, or other implement. In some embodiments, the system 100 may include a triggering device at or near the launch point 108 that may be in electronic communication (wired or wirelessly) with the tracking computer 106, the camera 118, and/or the LIDAR devices 104. In some embodiments, the triggering device may be triggered when the object 102 is launched, either mechanically, optically, electronically, etc., and may transmit launch signals to the tracking computer 106, the camera 118, and/or the LIDAR device 104 indicating that an object 102 has been launched. In some embodiments, the object 102 may be a golf ball and the user 110 may strike the golf ball with a golf club.
The one or more LIDAR devices 104 may include a first LIDAR device 104a and a second LIDAR device 104b. LIDAR devices may generally be used to determine ranges to objects or surfaces using a laser or lasers that may include ultraviolet, visible, or near infrared light. The one or more LIDAR devices may be of any suitable type, such as incoherent or coherent, micropulse or high energy, spindle-type, solid-state, or flash LIDAR. In some embodiments, the one or more LIDAR devices 104 may all be of the same design or may be of various designs. In some embodiments, the one or more LIDAR devices 104 may utilize a single scanning laser, or may utilize multiple scanning lasers. In some embodiments, the first LIDAR device 104a may emit pulses of one or more first lasers 105a and the second LIDAR device 104b may emit pulses of one or more second lasers 105b.
In some embodiments, lasers or other light may be emitted from each of the one or more LIDAR devices 104, some of which may strike the object 102 in flight down range 109. Portions of the lasers may be reflected back to the one or more LIDAR devices and detected by one or more detectors in each LIDAR device. The time between emission of a particular laser pulse and that pulse's arrival back at the detector may be used to determine the distance to a surface of the object 102 at a given point in time. Over time, as additional object surface locations may be located down range 109 during the object's 102 flight, the tracking computer 106 may receive and analyze each data point, and using mathematical techniques, extrapolate information about the object's flight path and trajectory or may determine instantaneous flight information.
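The time-of-flight calculation described above may be sketched as follows; the function name is an illustrative assumption, not part of the disclosure:

```python
# Minimal sketch of the time-of-flight ranging described above: the
# range to the reflecting surface is half the round-trip time of a
# pulse multiplied by the speed of light.
C = 299_792_458.0  # speed of light in vacuum, m/s

def range_from_round_trip(t_seconds):
    """Distance (meters) to the surface that reflected the pulse."""
    return C * t_seconds / 2.0

# A ball roughly 3 m down range returns a pulse in about 20 ns
d = range_from_round_trip(20e-9)
```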
In some embodiments, the LIDAR devices 104 may each repeatedly emit a single laser pulse that may be redirected across a one- or two-dimensional field of view (FOV) at a predetermined frequency. Pulse repetition rates may vary by embodiment, but may range, for example, between several hundred pulses per second (e.g., 150-300 Hz) and hundreds of thousands of pulses per second (e.g., 150-300 kHz). In some embodiments, the LIDAR devices 104 may have a scan frequency of greater than 150 kHz. In some embodiments, the LIDAR devices 104 may include lasers with a pulse wavelength between about 1,500 nanometers and about 2,000 nanometers. In some embodiments, the higher the pulse repetition rate and the number of times the FOV may be scanned during the object's flight, the more accurate the tracking system 100 may be at determining flight characteristics such as launch conditions. Those skilled in the art will recognize that LIDAR, laser, and sensor technology may continue improving over time and such improvements may be reflected in the scope of the disclosure. In some embodiments, the single repeating laser pulses from each LIDAR device 104 may scan a given FOV one or more times during the object's 102 flight. In some embodiments, the LIDAR devices 104 may use multi-scan lasers, where each device may emit multiple laser pulses at a time, allowing for increased scanning rates and/or higher resolution imaging than may be possible with single-laser devices. When multiple laser pulses may be emitted at once, a larger portion of the FOV may be covered per unit of time and, accordingly, more information about the object's 102 positioning in the FOV per unit of time may be available for analysis by the tracking computer 106. Those skilled in the art will also appreciate that other LIDAR technologies may be implemented within the scope of the disclosure.
The LIDAR devices 104 may each be oriented to scan a particular field of view 107 down range 109 of an object 102 launch point 108. In some embodiments, the first LIDAR device 104a may be positioned and configured to scan a first field of view (FOV) 107a, and the second LIDAR device 104b may be positioned and configured to scan a second FOV 107b. In some embodiments, the first FOV 107a and the second FOV 107b may be substantially orthogonal to one another, such that FOV angle 116 may be about 90 degrees. In some embodiments, additional LIDAR devices 104 may also be used that may be positioned and configured to scan additional FOVs that may or may not be orthogonal to one or both of the first FOV 107a and the second FOV 107b. In some embodiments, each LIDAR device 104 may be configured to scan a portion of the FOV defined by a horizontal and/or vertical angle. The size or area of the first FOV 107a may be determined based on a first horizontal scan angle 114a and a first vertical scan angle, and the second FOV 107b may be determined based on a second horizontal scan angle 114b and a second vertical scan angle. In one, non-limiting example, the first and second horizontal scan angle 114a, 114b may be about 20 degrees, and the first and second vertical scan angle may be about 40 degrees, although those skilled in the art will recognize that other angles may be used. Relatively smaller FOVs may be scanned faster than larger FOVs and therefore may allow for higher resolution imaging and greater accuracy, all else equal. However, if a FOV is too small, the object 102 may not appear in the FOV at all, or may only appear for a short amount of time. A relatively larger FOV may provide for greater likelihood of object detection, but may limit resolution due to longer scan times and relatively fewer data points, all else equal.
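The FOV trade-off described above may be illustrated with a back-of-the-envelope sketch; the pulse rate and scan rate below are assumed example values, not parameters from the disclosure:

```python
# Illustrative sketch of the FOV/resolution trade-off: at a fixed
# pulse repetition rate, a smaller FOV receives more pulses per
# square degree on each scan, i.e., higher scanning density.
def pulses_per_sq_degree(pulse_rate_hz, scans_per_s, h_deg, v_deg):
    """Average pulse density (pulses per square degree per scan)."""
    pulses_per_scan = pulse_rate_hz / scans_per_s
    return pulses_per_scan / (h_deg * v_deg)

# Assumed 150 kHz pulse rate and 100 full-FOV scans per second
full = pulses_per_sq_degree(150_000, 100, 20, 40)  # 20 x 40 degree FOV
tight = pulses_per_sq_degree(150_000, 100, 8, 8)   # reduced 8 x 8 FOV
```

Under these assumed numbers the smaller window is scanned roughly twelve times more densely, which is the trade-off between detection likelihood and resolution noted above.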
In some embodiments, image resolution may be increased without decreasing the size of the FOV by using additional LIDAR devices 104. Multiple LIDAR devices may work in tandem to scan down range 109 more quickly, or provide for additional scanning density in the same amount of time as a single LIDAR device would take to cover the same area. For example, using two LIDAR devices 104 instead of one may allow the system 100 to effectively double the number of data points acquired during a given time period, or may expand the down range area covered during that time period. Of course, those skilled in the art will recognize that virtually any number of LIDAR devices may be used to increase the resolution, increase the scanned area, reduce scan time, or all three.
In some embodiments, scan time may be reduced (and thus resolution increased) by using LIDAR devices with multi-scan lasers, as discussed above. In some embodiments, the tracking system 100 may benefit from using relatively small FOVs, for example, about 20 degrees horizontally by about 40 degrees vertically. In some embodiments, the scanned field down range 109 may become dense with laser pulses such that laser pulses from the first LIDAR device 104a and the second LIDAR device 104b (or additional LIDAR devices) may interfere with one another. In other words, there may be cross talk between the multiple laser/receiver pairs in the LIDAR devices 104. In some embodiments, this interference may be reduced by assigning a unique wavelength to each laser/receiver pair in each LIDAR device 104. In such embodiments, each respective receiver may be programmed or otherwise configured to only recognize or to only process lasers received at the unique wavelength assigned to that particular LIDAR device. For example, each LIDAR device or the receivers themselves may include a narrow bandpass optical filter, matched to each respective laser, that may block light from the other lasers (i.e., other wavelengths). Alternatively (or in addition), tracking software may be programmed or configured to only recognize (i.e., only “count”) returns from lasers having the wavelength assigned to that particular LIDAR device. In some embodiments, another way to reduce and/or avoid interference may be to separate each laser/receiver pair by timing. In such embodiments, each LIDAR device 104 may be configured to emit laser pulses at times when the other LIDAR devices may not be emitting laser pulses. For example, the first LIDAR device 104a may emit a laser pulse or multi pulse 105a first, the second LIDAR device 104b may emit a laser pulse or multi pulse 105b second, and the pulses may then be offset so as to never overlap in time and thus reduce or avoid interference.
In some embodiments, each LIDAR device 104 may be in electronic communication with one another to effectively stagger the laser pulses, or may be coordinated via the tracking computer 106 or another computing device. In some embodiments, the distance from the receiver to the object 102 may be relatively short, such as less than 10 feet. In such embodiments, the time for each laser pulse to reflect from the object 102 and return to the receiver may be relatively short, such as in the range of about 20 nanoseconds. In one non-limiting example, coordinating the LIDAR devices such that the laser pulse(s) from each LIDAR device 104 may be emitted once every 25 nanoseconds may allow the system 100 to take about 40,000,000 measurements per second.
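The timing-based separation and the example measurement rate above may be sketched as follows; the function and its parameters are illustrative assumptions using the 25-nanosecond slot from the example above:

```python
# Hypothetical time-division schedule for multiple LIDAR devices:
# devices take turns firing, one pulse per 25 ns slot, so that no two
# pulses are ever in flight at the same time.
def pulse_schedule(n_devices, slot_ns, duration_ns):
    """Return a dict mapping device index -> list of emission times (ns)."""
    times = {d: [] for d in range(n_devices)}
    t, d = 0, 0
    while t < duration_ns:
        times[d].append(t)
        d = (d + 1) % n_devices  # next device takes the next slot
        t += slot_ns
    return times

# Two devices sharing 25 ns slots over the first microsecond:
# one pulse every 25 ns system-wide -> about 40,000,000 per second
sched = pulse_schedule(2, 25, 1_000)
```

Note that the combined rate (one pulse per 25 ns across all devices) matches the approximately 40,000,000 measurements per second figure given above.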
In some embodiments, the disclosure describes implementing a dynamic FOV method that may reduce the size of the FOV that each LIDAR device may scan per unit time based on object flight characteristics determined on previous scans. For example, an FOV that may be 20 degrees by 40 degrees and containing a single object 102 (e.g., golf ball) that may be about two feet from the LIDAR devices may still be mostly empty space, as the object may subtend only about 4 degrees of the FOV. Further, the scans that result in a signal reflected from the object provide the most information for analyzing the object's 102 flight characteristics. Accordingly, in some embodiments, the disclosure describes using the location of the object 102 to limit the FOV of each LIDAR device 104 to improve the useful scan rate, resolution, or both. In some such embodiments, each LIDAR device 104 may scan the entire FOV (e.g., 20 degrees by 40 degrees) until an object may be detected (i.e., until returns from the laser pulses indicate an object in the FOV). Subsequent scans may then limit the FOV to an area within the original FOV where the object has been detected, and may in some embodiments include some margin for error (e.g., about 8 degrees by 8 degrees). In some embodiments, the FOV for subsequent scans may be decreased significantly (e.g., decreasing the area of the field of view by over 90%), thereby improving scan rates and/or improving resolution. In some embodiments, information regarding the speed and/or trajectory of the object may be used to predict subsequent FOV areas that may be most likely to include the object. Further, in some embodiments, the FOV may be additionally decreased as the distance to the object 102 down range 109 increases, as the object and any unpredictable object movements subtend a smaller angle.
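The dynamic FOV reduction described above may be sketched as follows; the window representation, margin value, and function name are illustrative assumptions only:

```python
# Illustrative sketch of the dynamic-FOV method: after the ball is
# detected, limit the next scan to a small window (with a margin for
# error) centered on the detection, clamped to the full FOV.
def shrink_fov(full_fov, hit_az, hit_el, margin_deg=4.0):
    """full_fov: (az_min, az_max, el_min, el_max), all in degrees.
    hit_az/hit_el: azimuth/elevation of the detected object.
    Returns the reduced scan window for the next scan."""
    az_min, az_max, el_min, el_max = full_fov
    return (max(az_min, hit_az - margin_deg),
            min(az_max, hit_az + margin_deg),
            max(el_min, hit_el - margin_deg),
            min(el_max, hit_el + margin_deg))

full = (-10.0, 10.0, 0.0, 40.0)                      # 20 x 40 degree FOV
window = shrink_fov(full, hit_az=2.0, hit_el=12.0)   # ~8 x 8 degree window
```

A fuller implementation might also propagate the detected trajectory forward to center the next window where the object is predicted to be, as the passage above suggests.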
In embodiments where the system 100 uses multiple LIDAR devices 104, each laser may be redirected independently to efficiently scan the reduced field of view, further improving resolution and scan rate.
In some embodiments, the LIDAR devices 104 themselves may be programed or physically configured to perform the dynamic FOV method described herein and with respect to
In some embodiments, the one or more LIDAR devices 104 may achieve high enough resolution to map unique physical features of the surface of the object 102 in flight. In such embodiments, the LIDAR devices 104 may be able to determine additional object flight characteristics, such as spin. For example, the LIDAR devices 104 (or the tracking computer 106) may differentiate between the position of a particular physical surface feature at a first time (e.g., during a first scan) and the position of that same physical surface feature at a second time (e.g., during a subsequent scan). The LIDAR device 104 and/or the tracking computer 106 may determine the spin of the object 102 based on, for example, the elapsed time between the first time and the second time, the distance that the physical surface feature has moved, a diameter of the object, the speed of the object, etc. In embodiments where the object 102 may be a golf ball, the golf ball surface may be substantially covered in dimples and/or concave craters. In some embodiments, the LIDAR device 104 may differentiate between dimples on the golf ball such that object characteristics such as spin rate may be determinable. In other words, given enough granular definition, the LIDAR device 104 may capture substantially all the relevant object flight information within the field of view, even without the camera 118, as described below. In some embodiments, the system 100 may also or alternatively use Doppler LIDAR to sense the speed and/or acceleration of the object 102. Further, in some embodiments, spin of the object 102 may be inferred based on a speed difference between a top of the object and the bottom of the object. Of course, those skilled in the art will recognize that any of the above embodiments may also be used in combination with one another.
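The two spin estimates described above may be sketched as follows; the function names, sample values, and the rigid-sphere relation v = v_center ± ωr used for the Doppler variant are illustrative assumptions, not part of the disclosure:

```python
# Illustrative sketches of the two spin estimates described above:
# (1) from the arc a surface feature (e.g., a dimple) travels between
#     two scans, and
# (2) from the Doppler speed difference between the top and bottom of
#     the ball, assuming a rigid sphere where v = v_center +/- omega*r.
import math

def spin_rpm_from_feature(arc_m, dt_s, radius_m):
    """Spin implied by a surface feature moving arc_m meters in dt_s."""
    omega = arc_m / radius_m / dt_s          # angular rate, rad/s
    return omega * 60.0 / (2.0 * math.pi)    # revolutions per minute

def spin_rpm_from_doppler(v_top, v_bottom, radius_m):
    """Backspin implied by the top/bottom speed difference (m/s)."""
    omega = abs(v_top - v_bottom) / (2.0 * radius_m)
    return omega * 60.0 / (2.0 * math.pi)

r = 0.02135  # golf ball radius, m (about 1.68 inch diameter)
rpm1 = spin_rpm_from_feature(arc_m=0.0067, dt_s=0.001, radius_m=r)
rpm2 = spin_rpm_from_doppler(v_top=76.7, v_bottom=63.3, radius_m=r)
```

With these assumed sample values both methods imply a spin on the order of 3,000 rpm, a plausible magnitude for a driven golf ball.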
In some embodiments, substantially conventional LIDAR devices may be used or specially programmed to perform the tracking activities described herein. For example, LIDAR devices having a rotating mirror to direct the laser beam(s) for scanning an area may be used in some embodiments. In such embodiments, however, the scan rate of the FOV may be fixed to or otherwise related to the rotational speed of the mirror. In some embodiments, such as where a more flexible or dynamic scan rate may be desirable, the LIDAR devices 104 may include one or more mirrors that may move in alternative ways. For example, a voice coil may be connected to the mirror so as to enable the voice coil to rotate, bend, or otherwise alter the shape of the mirror and thus control reflected direction of the laser beams. In such embodiments, the laser beam may be directed to scan the FOV by altering a current flowing through the voice coil. In some embodiments, the current may be controlled by the tracking computer 106, or may be controlled by a separate controller internal to the LIDAR device, or by any other suitable control mechanism. In such embodiments, the FOV, range, and scan rate may be changed programmatically as desired for a given application, or to alter the FOV such as with reference to
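The programmatic beam steering described above may be sketched as follows; the triangle-wave drive, gain value, and function names are hypothetical modeling assumptions, not a description of any particular voice-coil hardware:

```python
# Hypothetical sketch of driving a voice-coil mirror programmatically:
# a triangle-wave drive current sweeps the mirror (and hence the laser
# beam) back and forth across the commanded FOV. The current-to-angle
# gain is an assumed calibration constant.
def triangle_wave(t, period):
    """Unit triangle wave in [-1, 1] with the given period (seconds)."""
    phase = (t % period) / period
    return 4 * phase - 1 if phase < 0.5 else 3 - 4 * phase

def beam_angle(t, period_s, half_fov_deg, gain_deg_per_amp=10.0, max_amp=1.0):
    """Beam deflection (degrees) at time t for a triangle current drive."""
    current = max_amp * triangle_wave(t, period_s)      # drive current, amps
    angle = gain_deg_per_amp * current                  # mirror deflection
    return max(-half_fov_deg, min(half_fov_deg, angle)) # clamp to half-FOV
```

Because the drive waveform is software-defined, the sweep period (scan rate) and amplitude (FOV) can be changed on the fly, which is the flexibility the passage above contrasts with a fixed-speed rotating mirror.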
In some embodiments, the system 100 may also include one or more visual trackers, such as cameras 118. The camera(s) 118 may be high speed cameras capable of capturing images with relatively low exposure times, such as less than 1/1,000 second, or may use frame rates in excess of 250 frames per second (fps). In some embodiments, it is contemplated that relatively lower speeds and frame rates may also be used. In some embodiments, the camera 118 may be capable of and/or configured to record and store images of the object 102 as it moves from the launch point 108 into a down range area 109. In some embodiments, the images captured by the camera 118 may be stored and played back, or be stored and used for analysis. In some embodiments, other types of cameras or visual trackers may be used within the scope of the disclosure. In some embodiments, the camera 118 may be configured to capture images in a sequential series of two or more images. In some embodiments, the camera 118 may be configured to capture images at a frame rate of greater than 200, 300, 400, or 500 frames per second. In some embodiments, the camera 118 may be configured to capture images at a frame rate of greater than 600, 700, 800, 900, or 1000 frames per second.
In some embodiments, the vision portion of the tracking system 100 may be used in combination with a marker detection algorithm to measure rotation of the object 102 as it moves across a field of view (FOV) of the camera system 118. In some embodiments, the camera 118 may be capable of acquiring multiple images within a set field of view. Each camera 118 may include a strobing lighting element that may be tied to a triggering device that may synchronize itself with the LIDAR devices 104. In some embodiments, the strobing lighting element may provide strobe lighting that may accentuate object movements or spin during flight that may be detectable with the camera system 118. In such embodiments, the camera 118 may be used to scan for predetermined patterns on the surface of the object. In some embodiments, the predetermined pattern may be made up of any rotationally unique marking that may be previously applied to the object. In some embodiments, if the markings are not unique geometric shapes, the markings may contain at least n+3 unique markings with an asymmetric layout.
In some embodiments, the LIDAR devices 104 may be used alone to track and monitor object flight characteristics, or may be used in combination with the vision system including the one or more cameras 118.
The physical elements that make up an embodiment of a server, remote cloud server 600, are further illustrated in
A database 1525 for digitally storing structured data may be stored in the memory 1510 or 1515 or may be separate. The database 1525 may also be part of a cloud of servers and may be stored in a distributed manner across a plurality of servers. There also may be an input/output bus 1520 that shuttles data to and from the various user input devices such as a microphone, a camera, a display monitor or screen, etc. The input/output bus 1520 also may control communicating with networks either through wireless or wired devices. In some embodiments, a launch tracker controller for running a launch tracker API may be located on the computing device 106. However, in other embodiments, the launch tracker controller may be located on server 600, or both the computing device 106 and the server 600. Of course, this is just one embodiment of the server 600 and additional types of servers are contemplated herein.
The foregoing description and drawings merely explain and illustrate the invention, and the invention is not limited thereto. While the specification is described in relation to certain implementations or embodiments, many details are set forth for the purpose of illustration. Thus, the foregoing merely illustrates the principles of the invention. For example, the invention may have other specific forms without departing from its spirit or essential characteristics. The described arrangements are illustrative and not restrictive. To those skilled in the art, the invention is susceptible to additional implementations or embodiments, and certain of the details described in this application may be varied considerably without departing from the basic principles of the invention. It will thus be appreciated that those skilled in the art will be able to devise various arrangements which, although not explicitly described or shown herein, embody the principles of the invention and are thus within its scope and spirit.