Tracking systems have used imagers capturing, for example, sequences of images of a target area including a moving object, to determine the position, speed and/or trajectory of the object. However, these systems, which generally employ two-dimensional imagers, have been unable to provide the desired level of accuracy under certain conditions. In addition, radar tracking systems used to track objects have also suffered from certain deficiencies.
Prior systems have attempted to cure these defects by combining pulse radar with imagers to track items such as planes and missiles. However, such systems are generally unsuitable for tracking fast-moving objects, such as sports balls, at short distances, near the ground and near other objects. For example, a pulse radar system determines the range to an object by transmitting signals at high power and determining a time required for the return of a portion of these signals reflected by the object. As only a very small fraction of the transmitted signal returns from the distant targets tracked by such systems, the receivers must be sensitive to faint signals orders of magnitude smaller than the transmitted signals. Thus, the receivers of such systems must be shut down when the pulse radar is transmitting or these highly sensitive receivers will be saturated or damaged by the high-power signal. The time required for the switch-over from transmission to receiving determines a minimum target range detectable by the pulse radar, generally on the order of several hundred meters. Furthermore, these systems are generally unable to discriminate between different objects at similar positions (e.g., a moving object passing close by a stationary object or an object moving on a different path). These systems are, therefore, not well suited to tracking an object such as a sports ball passing through an area near the ground that includes other objects such as moving or stationary players, trees, etc.
The present embodiments are directed to a system for tracking the movement of an object comprising a radar device having a first field of view, the radar device generating radar data indicating one of a range corresponding to a distance of a moving object within the first field of view from the radar device and a range rate corresponding to a rate at which the distance is changing relative to the radar device and an imager having a second field of view at least partially overlapping the first field of view in an overlap field of view, the imager generating imager data measuring, when the object is in the second field of view, an angular position of the object relative to the imager in at least one dimension in combination with a processor combining the radar data and imager data, when the object is in the overlap field of view, to identify a track of the object in at least two dimensions.
In an embodiment, the radar device is a one-dimensional radar and wherein the radar data includes a range rate for the object.
In an embodiment, the imager is a two-dimensional imager and the imager data measures the angular position of the object in at least two dimensions, the processor identifying the track of the object in three dimensions.
In an embodiment, the imager measures vertical and horizontal angles in an imager coordinate system.
In an embodiment, the processor includes data corresponding to a separation vector indicating a distance and orientation from the radar device to the imager.
In an embodiment, the processor calculates a unity vector from the imager to the object and, based on the unity vector, the radar data and the separation vector, the processor calculates the position of the object in three dimensions.
In an embodiment, the processor defines a field based coordinate system based on reference points within the overlap field of view and translates and rotates the position of the object in three dimensions into the field based coordinate system.
In an embodiment, the radar device detects a distance to the object and one of a horizontal and vertical angle to the target.
An embodiment also includes a memory storing a priori information predictive of a location in which the object is to be found.
In an embodiment, the processor uses the a priori information to define a region of interest within which the object is expected to appear as a reduced portion of one of the first and second fields of view.
In an embodiment, the a priori information includes at least one of information about a prior position of the object and a prior speed of the object and a prior range of the object.
In an embodiment, the object is a sports ball and wherein the a priori information concerns a location from which the ball is likely to be put into play.
In an embodiment, the object is a sports ball and the target volume includes a field of play and wherein the reference points include locations significant to the rules of play of a game to be played on the field of play.
In an embodiment, the radar device is a Doppler radar.
In an embodiment, the processor determines the distance based on the range rate from the radar device and an initial value for the range.
In an embodiment, the initial range value is based on a priori knowledge.
A method for tracking the movement of an object according to an embodiment comprises positioning a radar device aimed so that a first field of view covers at least a portion of a target volume through which an object is to move, the radar device generating radar data indicating a distance of a moving object within the first field of view from the radar and positioning an imager aimed so that a second field of view at least partially overlaps the first field of view within a desired portion of the target volume, the imager generating imager data measuring, when the object is in the second field of view, an angular position of the object relative to the imager in at least two dimensions in combination with combining the radar data and imager data, when the object is in the overlap field of view, to identify a track of the object in three dimensions.
The exemplary embodiments may be further understood with reference to the following description and the related appended drawings, wherein like elements are provided with the same reference numerals. The exemplary embodiments relate to a device, a system, and a method for tracking objects by combining data from one or more imagers and one or more radar devices. Although exemplary embodiments detailed herein describe the tracking of baseballs and golf balls, those skilled in the art will understand that any sports balls or even non-sports related objects may be tracked with the system in the same manner. Likewise, the system may track a baseball bat, a golf club or any other item that is detectable in the image and that generates a signal in the radar data.
It is noted that the first system 100 including a single imager 104 and a single radar 102 is only exemplary. In other configurations, there may be one or more imagers capturing one or more images and/or one or more radar devices obtaining radar information.
The radar tracking device 102 may be any radar configured to measure reflected radiation to detect a range, position, velocity and/or spin of an object. The radar tracking device 102 may, for example, be a continuous wave Doppler radar emitting microwaves at X-band (10.5-10.6 GHz) at approximately 500 milliwatts EIRP (Equivalent Isotropic Radiated Power), thus being suitable for complying with FCC and CE regulations for short range intentional radiators. Any type of continuous wave (CW) Doppler radar may be used, including phase- or frequency-modulated CW radar, multi-frequency CW radar or single-frequency CW radar. Tracking of objects may be based on the use of Doppler frequency spectrums. As would be understood, Doppler frequency spectrums refer to the data from continuous wave Doppler radar. Any other type of radar capable of tracking objects similar to those described herein may also be used, whether it tracks one-dimensionally, two-dimensionally or three-dimensionally. The radar device 102 has a FoV.
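For a CW Doppler radar of this type, the range rate of an object follows directly from the measured Doppler shift via the standard relation v = fD·λ/2. The following minimal sketch illustrates this relation; the 10.55 GHz carrier is an assumed mid-point of the X-band range quoted above, not a value specified by the embodiments.

```python
# Standard Doppler relation for a CW radar: v = f_D * lambda / 2.
# The carrier frequency is an assumed mid-point of the quoted X-band range.
C = 299_792_458.0      # speed of light (m/s)
F_CARRIER = 10.55e9    # carrier frequency in Hz (assumed)

def range_rate_from_doppler(f_doppler_hz: float) -> float:
    wavelength = C / F_CARRIER                 # ~2.84 cm at 10.55 GHz
    return f_doppler_hz * wavelength / 2.0     # m/s along the line of sight

# A ball approaching at 40 m/s produces a Doppler shift of roughly 2.8 kHz.
```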
The imager 104 may be any device configured to capture an image of the target area and can be configured to receive radiation in the visual or non-visual spectrum (such as infrared). For example, the imager may be a still camera or a video camera; thus, a single image of the target area may be captured, or a series of images may be captured sequentially over a period of time. The image(s) may be captured using any of a variety of techniques to generate different types of images (e.g., black and white, color, etc.). The imager 104 may also be configured with various features such as variable zoom or enlargement of elements in the target area, a selectable shutter time, a selectable number of images per time interval (e.g., frames per second), etc.
The radar 102 generates data to measure at least one dimension. For example, a one-dimensional radar generates data indicating a range and/or range rate (collectively or individually referred to herein as “range”) to a target; a two-dimensional radar generates data indicating range and/or range rate as well as a vertical angle or a horizontal angle; and a three-dimensional radar generates data indicating range and/or range rate, a vertical angle and a horizontal angle. The imager 104 captures an image including two or three dimensions. For example, the imager 104 may capture a two-dimensional image permitting measurement of a vertical angle and a horizontal angle or a three-dimensional image permitting identification of a three-dimensional location of a target (i.e., measuring a range as well as vertical and horizontal angles). The embodiment of system 100 utilizes a one-dimensional radar 102 and a two-dimensional imager 104. However, further embodiments may utilize any combination of the radar 102 and imager 104 such that information for all three dimensions (e.g., range, vertical angle and horizontal angle) is captured at selected time periods. For example, a three-dimensional radar and a three-dimensional imager provide redundant data that may be used to verify results and/or increase a level of confidence in their accuracy.
As will be described in greater detail below, the radar 102 and imager 104 capture radar data and images, respectively, of an object along a path in a time-synchronized manner. That is, a first radar data point and a first image may correspond to a position of the target at a first time while the object is traveling along the path. The radar 102 and imager 104 both track the object during a period of time to generate radar data and images, respectively, that are time synchronized by the system 100 (as described below) when the path is within the FoV 112 and FoV 114. Accordingly, the FoV 112 of the radar 102 and FoV 114 of the imager 104 must overlap in an overlap area 116.
Radar device 102 and imager 104 are positioned at an initial known position and orientation relative to one another and to the target area. As described above, in an exemplary embodiment, the tracking devices 102, 104 are positioned behind and elevated above the target area (i.e., the baseball field). The radar device 102 and the imager 104 are positioned a known distance, t, from one another.
The system 100 includes a data processing system 200 which, as would be understood by those skilled in the art, may include one or more computers coupled to the radar device 102 and the imager 104 via either wired or wireless connection. In an exemplary embodiment, a single computer 201 is used to perform the radar and image tracking as well as the merging of the data output from the radar device 102 with the data from the imager 104. However, in another exemplary embodiment, the data processing system 200 includes separate computers 202, 202′, each associated with a corresponding one of the radar device 102 and imager 104, as well as a central computer 204 that coordinates data from the two computers 202, 202′.
Data from the radar device 102 and the imager 104 are time synchronized to ensure accuracy of the tracking information based on the combination of data from these two sources. For example, when correlating range information from the radar device 102 to frames captured by the imager 104 relating to a baseball thrown by a pitcher, the level of accuracy in calculating the position of the baseball is increased when the times at which each of these data points was captured are properly matched to one another. That is, to ensure accurate tracking, it is necessary to ensure that the data merged from the radar device 102 and the imager 104 correspond to the position of the ball at the same time (or as nearly the same time as possible). As noted above, the radar 102 may be a CW Doppler radar generating radar data at time intervals much shorter than the frame rate of the imager (i.e., the number of frames captured per second). Thus, even though there is not a 1:1 correspondence between the data points, it is necessary to match the radar data to the frame from the imager 104 most nearly synchronous therewith to accurately determine the trajectory of the baseball. To improve the time matching between the radar data and the captured images, the radar device 102 and imager 104 may be hardwired together so that, for example, the imager 104 may provide to the radar device 102 a signal indicating a timing of the capture for each frame so that an attached or integral computer may determine a time correspondence between images and radar data. Alternatively, a similar signal may be provided by the radar device 102 to the imager 104. For example, the imager 104 may receive a signal from the radar device 102 signaling when to take a picture or signaling to take a picture, for example, every 20 milliseconds (ms). In another example, the imager 104 can send a pulse to the radar device 102 signaling at what time each image frame was taken. Software on the data processing system 200 may then match the imager's image data to radar data from the radar device 102. In another example, every time the imager 104 takes an image, a signal may be sent to the radar device 102, which marks the time at which the frame was captured (making an allowance for a signal and processing delay) to sync the images from the imager 104 to the corresponding radar data.
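As a rough illustration of this matching, the following sketch (one possible implementation, assumed for illustration rather than taken from the embodiments) pairs each image frame with the radar sample nearest in time, given timestamps on a common clock:

```python
import numpy as np

def match_radar_to_frames(radar_t, frame_t):
    """Return, for each frame timestamp, the index of the nearest radar sample."""
    radar_t, frame_t = np.asarray(radar_t), np.asarray(frame_t)
    idx = np.clip(np.searchsorted(radar_t, frame_t), 1, len(radar_t) - 1)
    left, right = radar_t[idx - 1], radar_t[idx]
    idx -= (frame_t - left) < (right - frame_t)   # step back if left sample is closer
    return idx

# Radar samples every 2 ms, frames every 20 ms (50 frames per second)
radar_t = np.arange(0.0, 1.0, 0.002)
frame_t = np.arange(0.0, 1.0, 0.020)
frame_to_radar = match_radar_to_frames(radar_t, frame_t)
```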
The flow chart discussed below illustrates an exemplary method 300 by which the system 100 tracks an object such as a baseball 106.
In step 310, the radar device 102 records radar data and the imager 104 records image data, which are transmitted to the computer 201. As described above, the radar 102 generates radar data corresponding to the object moving in the target area within the FoV 112. The imager 104 may capture images of a target area within the FoV 114 in which the object is moving. According to exemplary embodiments, it may be assumed that the object is moving in the target area within an overlap area 116 of the FoV 112 and the FoV 114.
In step 320, the computer 201 detects whether a target object is captured in either the radar data generated by the radar 102 or the images captured by the imager 104. The object detection may be performed using any identification mechanism. For example, the computer 201 may utilize pattern recognition algorithms to detect the presence of a target object in one or more images. In another example, the radar device 102 may detect a moving object in the radar data, and this detection may also be used to restrict the search area. As will be described in more detail below, the system 100 may use a priori knowledge to define a region of interest (ROI) as a selected portion of the FoV 114 to be searched for the target object. For example, for baseball tracking, the system 100 may define an ROI as a portion of an image including the pitcher's mound 108 and some surrounding area, as every pitch originates from this area. This reduces the computational burden and can accelerate the identification of the target object in the image data. Similarly, the system 100 can define a target range as a distance to the mound 108±a predefined margin as an ROI of data to be searched within the radar data. Movement detected within the ROI may, therefore, more quickly be identified as a pitched baseball 106.
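A minimal sketch of such a priori ROIs for the baseball example follows; the pixel coordinates, mound range and margins are illustrative assumptions, not values from the embodiments:

```python
# Illustrative a priori regions of interest (all values assumed)
MOUND_CENTER_PX = (960, 620)   # mound location in the image (pixels)
PIXEL_HALF_SIZE = 150          # surrounding area to search (pixels)
MOUND_RANGE_M = 18.4           # radar-to-mound distance (meters)
RANGE_MARGIN_M = 2.0           # +/- margin around the mound range (meters)

def initial_rois():
    cx, cy = MOUND_CENTER_PX
    image_roi = (cx - PIXEL_HALF_SIZE, cy - PIXEL_HALF_SIZE,
                 cx + PIXEL_HALF_SIZE, cy + PIXEL_HALF_SIZE)
    radar_roi = (MOUND_RANGE_M - RANGE_MARGIN_M,
                 MOUND_RANGE_M + RANGE_MARGIN_M)
    return image_roi, radar_roi   # (left, top, right, bottom), (near, far)
```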
In step 330, the computer 201 makes a subsequent measurement corresponding to a new position of the baseball 106 in one or both of the radar data and the image data. Based on this new data, the system 100 defines, in step 340, a new ROI for the radar (i.e., a new range within which the subsequent detection of the ball is expected) and a new ROI for the imager (i.e., a portion of the subsequent image (frame) within which the ball is expected to be located). Using the previous location and the range and/or range rate data from the radar, the system 100 predicts where the ball 106 will be at the subsequent radar reading and in the subsequent frame to define new ROIs for each of the radar 102 and the imager 104. The method then proceeds to step 350 in which the system 100 tracks the object (i.e., locates the object within the ROIs of the radar device 102 and the imager 104). For the radar 102, the ROI also includes a region of interest in range and/or range rate. As can be seen in the flowchart, data from the tracking based on radar data may be employed in the determination of the ROI for the imager 104 and data from the tracking based on imager data may be employed in the determination of the ROI for the radar device 102.

As indicated above, in step 340, information from either the radar device 102 or the imager 104 about the position of the baseball 106 may be used to limit a portion of the respective FoVs 112, 114 to be searched by defining an ROI. In a first example, once an object has been identified by the radar device 102, the system 100 may define an ROI as a subset of the total FoV 112 as well as a limitation of range and/or range rate, which leads to a computational reduction for the radar data as the entire FoV 112 and range/range rate need not be analyzed for the presence of the ball 106. That is, only a portion of the FoV 112 may be analyzed while ignoring the remainder of the FoV 112 and, similarly, only a part of the range and/or range rate is analyzed while ignoring the rest, which likely has no relevant information associated with the ball's trajectory. Similarly, based on a priori information or a prior location and/or trajectory of the ball 106, the system may set an ROI for the imager 104 as a subset of its total FoV 114, which leads to a computational reduction for the imager data since the entire FoV 114 need not be analyzed for the presence of the ball 106. That is, only a portion of the FoV 114 may be analyzed while ignoring the remainder of the FoV 114, which likely has no relevant information associated with the ball's trajectory. Similar to step 340, the radar track of the radar 102 may now be used to confirm the region of interest for the imager 104. Furthermore, if the imager 104 previously defined a region of interest in step 340, the radar track may identify an even smaller region of interest, resulting in further computational reduction for the imager data. Positive confirmation of the presence of the ball 106 from the imager 104 may confirm the radar data in setting the region of interest. Alternatively, a failure of the image data to confirm the presence of the ball 106 may be used to reject the radar track. Furthermore, an image track of the imager 104 may confirm the region of interest for the radar 102. Furthermore, if the radar 102 previously defined a region of interest in step 340, the image track may identify an even smaller region of interest, resulting in a further computational reduction for the radar data.
Positive confirmation of the presence of the ball 106 from the radar 102 may confirm the image track in setting the region of interest. Alternatively, if the radar data fails to confirm the presence of the ball 106 detected in the image data, this may be used to reject the image track. As would be understood by those skilled in the art, false image tracks may occur due to challenging light conditions, artifacts in images, similar false detection scenarios, etc.
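One way the ROI prediction of step 340 might be realized is sketched below, assuming a constant-velocity prediction and a caller-supplied camera projection; both are assumptions for illustration, not the method prescribed by the embodiments:

```python
import numpy as np

def predict_rois(pos, vel, dt, project_to_pixels,
                 pixel_margin=60, range_margin=1.0):
    """Predict the next ROIs for the imager and the radar from the last track state.

    pos and vel are the ball's position and velocity in radar coordinates;
    project_to_pixels is a hypothetical camera-model callable (assumed to
    account for the radar-imager displacement) mapping a 3D point to pixels.
    """
    pos_next = pos + vel * dt                  # constant-velocity prediction
    u, v = project_to_pixels(pos_next)
    image_roi = (u - pixel_margin, v - pixel_margin,
                 u + pixel_margin, v + pixel_margin)
    r_next = float(np.linalg.norm(pos_next))   # predicted range to the radar
    radar_roi = (r_next - range_margin, r_next + range_margin)
    return image_roi, radar_roi
```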
Thereafter, in step 360, the tracking data from the radar and imager are merged to calculate a three-dimensional position of the ball 106 (e.g., translated into the field coordinates) and this data may be provided as a live data stream available for other applications, such as overlaying tracking results on live video or 2D/3D computer graphics, e.g., for broadcast purposes. In step 370, the system 100 determines whether this represents the end of the track. If yes, the method proceeds to step 380 in which output data (e.g., calculating the break of a pitch, etc.) is generated and, if no, the method 300 returns to step 330. Specifically, the baseball 106 is tracked from a first image when the baseball 106 is traveling along the trajectory to a final image when the baseball 106 impacts an object such as a bat, a glove, or the ground. The baseball 106 is also tracked from a first location in which the baseball 106 is identified in the radar or image data until a final location in which the baseball 106 has stopped or has deviated from its course beyond a deviation threshold (e.g., the velocity vector changes direction). At this point, if the trajectory has deviated beyond the threshold value (e.g., if the ball 106 is hit by the batter), the system 100 may begin following the new trajectory and the method 300 will recommence defining new ROIs based on the new trajectory until the new trajectory ends.
Steps 330-370 may be repeated at each time interval for which new radar measurements or new image frames are generated. For example, in an exemplary system, the computer 201 may perform step 330 every 20 ms (or 50 times per second) for each new frame recorded, or perform step 330 every time a new radar measurement is taken, which is typically much more frequent (e.g., every 1-5 ms). For each radar measurement taken by the radar device 102, the computer 201 may calculate the range to the ball 106 using raw Doppler radar data.
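One plausible form of this range calculation, consistent with the embodiments above in which the distance is determined from the range rate and an initial a priori range, is to integrate the measured range rate over time. The sketch below assumes trapezoidal integration and illustrative sample values:

```python
import numpy as np

def integrate_range(r0, range_rates, timestamps):
    """R(t_k) = R0 + integral of the measured range rate up to t_k."""
    dt = np.diff(timestamps)
    increments = 0.5 * (range_rates[:-1] + range_rates[1:]) * dt  # trapezoid rule
    return np.concatenate(([r0], r0 + np.cumsum(increments)))

# e.g., a pitch approaching the radar at ~40 m/s, sampled every 2 ms,
# starting from an assumed a priori range of 18.4 m (the mound distance)
t = np.arange(0.0, 0.4, 0.002)
rdot = np.full_like(t, -40.0)
ranges = integrate_range(18.4, rdot, t)
```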
As described above, in step 360, the radar track and the image track are merged to determine the position of the ball 106 in a three-dimensional coordinate system (e.g., the field-based coordinate system). In the exemplary embodiment described above, the radar 102 may generate one-dimensional radar data measuring ranges and the imager 104 may capture two-dimensional imager data measuring vertical and horizontal angles. The computer 201 may merge the information such that all three parameters (e.g., dimensions) are known for the trajectory of the object. The pixel position at which the ball is detected is first converted to a normalized pixel position (u,v) using equation [1]:
(u,v)=((upx−ppu)/f,(vpx−ppv)/f). [1]
where (upx, vpx) 106′ is the pixel value in the image, (ppu, ppv) is a principal point within the image (typically very close to the center of the image) and f is the focal distance, the latter two being predetermined for the imager 104.
The normalized pixel (u,v) is then converted by the computer 201 to a unity vector nc 506 using equation [2]:
nc=(αc,βc,γc)=(1,−v,u)/√(1+u²+v²). [2]
Upon installation, the displacement vector t 500 between the radar 102 and the imager 104 (and its length t=|t|) is determined. Thus, when the ball 106 is detected in the image at the pixel position (upx, vpx) 106′ from the imager 104, this angular detection may be combined with the range R 502 measured by the radar 102 as follows.
The previously described vector nc 506 is the unity vector measured from the imager 104 and a vector n 508 is a unity vector measured from the radar 102 towards the ball 106. Similar to the unity vector nc 506 in equation [2], the unity vector n has the coordinates n=(α,β,γ). An angle 510 (denoted as φR) is defined as the angle between the vector n 508 and the vector t 500, while an angle 512 (denoted as φT) is defined as the angle between the vectors n 508 and nc 506 and an angle 514 (denoted as φc) is defined as the angle between the vectors nc 506 and t 500. Then φc 514 is equal to: arccos(dot(nc, t/|t|)), where dot( ) denotes the dot (scalar) product. After determining φc 514, φT 512 may be determined as equal to: arcsin((|t|/R)·sin(φc)). After determining φT 512, φR 510 may be determined as equal to: π−(φT+φc) in radians. The distance Rc 504 may subsequently be determined based on the angles as equal to: R·(sin(φR)/sin(φc)). Thus, the combined three-dimensional tracking (denoted by the position vector X with origin at the radar position 102) of the baseball 106 may be determined as X=R·n=Rc·nc−t. Alternatively, the three-dimensional position of the ball 106 might be determined with origin at the imager position 104, represented by the vector Xc=Rc·nc. In the above, the calculation method has been explained with an a priori known three-dimensional displacement vector t between the radar 102 and the imager 104. There are no limitations on the distance t of the displacement nor on the orientation of t as long as the FoV 112 and 114 overlap. If, for example, the distance t is small relative to the distances R 502 and Rc 504, then Rc can be assumed to equal R with only a small uncertainty, thus simplifying the calculations. This corresponds to assuming that t equals the null vector 0, which is the situation where the imager 104 is placed at the origin of the radar 102.
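The full calculation above may be collected into a short routine. The following sketch follows equations [1] and [2] and the triangle relations just described; consistent with X = Rc·nc − t, it assumes t is the displacement from the imager to the radar (the direction convention, like the example values below, is an assumption):

```python
import numpy as np

def merge_radar_image(u_px, v_px, pp, f, R, t):
    """Combine an image detection (in pixels) with a radar range R into a 3D position.

    pp = (ppu, ppv) is the principal point and f the focal distance (eq. [1]);
    t is the radar-imager displacement vector, with its direction assumed
    such that X = Rc*nc - t holds, with the origin at the radar.
    """
    u = (u_px - pp[0]) / f                     # eq. [1]: normalized pixel
    v = (v_px - pp[1]) / f
    nc = np.array([1.0, -v, u]) / np.sqrt(1.0 + u**2 + v**2)   # eq. [2]
    t = np.asarray(t, dtype=float)
    t_len = np.linalg.norm(t)
    phi_c = np.arccos(np.dot(nc, t / t_len))   # angle between nc and t
    phi_t = np.arcsin((t_len / R) * np.sin(phi_c))
    phi_r = np.pi - (phi_t + phi_c)
    Rc = R * np.sin(phi_r) / np.sin(phi_c)     # imager-to-ball distance
    X = Rc * nc - t                            # 3D position, origin at radar
    return X, Rc

# Hypothetical usage with assumed calibration values:
# X, Rc = merge_radar_image(980.0, 600.0, pp=(960.0, 620.0), f=1500.0,
#                           R=15.0, t=(0.0, 0.5, 1.0))
```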
It is again noted that the imager 104 and the radar device 102 are time synchronized such that the information extracted from the images and the radar data may be correlated to one another. Through the time synchronization, the computer 201 merges the parameters to determine a three-dimensional tracking of the baseball 106 and ultimately the trajectory 316. For example, the computer 201 may be configured to execute a trajectory application. The trajectory application may receive parametric information of the trajectory 316 from the data merging application. The trajectory application may determine the trajectory based on the parametric information. For example, the trajectory application may determine a first three-dimensional location of the object 106 in a target area 122 given first parametric information at a first time and subsequently determine a second three-dimensional location of the object 106 in the target area 122 given second parametric information at a second, later time. Using the first and second three-dimensional locations in the target area 122, the trajectory application generates a tracklet that connects the first and second three-dimensional locations. The trajectory application may continue to perform these operations to determine further tracklets at later times. The tracklets may then be connected in chronological order to determine the trajectory of the ball 106.
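A minimal sketch of this tracklet chaining follows; the data layout is an assumption for illustration:

```python
def build_trajectory(times, positions):
    """Connect time-synchronized 3D positions into chronological tracklets."""
    samples = sorted(zip(times, positions), key=lambda s: s[0])
    # each tracklet connects two consecutive 3D locations: (t0, t1, x0, x1)
    return [(t0, t1, x0, x1)
            for (t0, x0), (t1, x1) in zip(samples, samples[1:])]
```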
Those skilled in the art would understand that in another exemplary embodiment, the radar 102 may capture three-dimensional information measuring the vertical and horizontal angles as well as the range to the ball 106. In this instance, n may be directly determined from the radar data. Accordingly, the vertical and horizontal angle data from the radar and the imager may be weighted to obtain more accurate results. Typically, the vertical and horizontal angle data received from the imager 104 will be more accurate and will be weighted more heavily. However, in some cases (e.g., the background and ball are similar in color, or another object partially blocks or occludes the ball in the image), the radar data may be more accurate. In another exemplary embodiment, the imager 104 and the radar device 102 may both individually capture three-dimensional images measuring vertical and horizontal angles as well as ranges. In this instance, the computer 201 may merge the information such that the vertical angle measurements are merged together, horizontal angle measurements are merged together, and ranges are merged together such that all three parameters utilize redundant information to enhance accuracy.
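Such a weighted merge of redundant measurements might look like the following sketch; the weighting scheme and weights are illustrative assumptions, not values from the embodiments:

```python
import numpy as np

def fuse_angles(angle_radar, angle_imager, imager_reliable=True):
    """Blend redundant angle measurements, favoring the imager when reliable."""
    w_imager = 0.9 if imager_reliable else 0.2   # assumed weights
    return (w_imager * np.asarray(angle_imager)
            + (1.0 - w_imager) * np.asarray(angle_radar))
```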
It is again noted that the use of a baseball is only exemplary. The exemplary embodiments may be utilized to track any object or part of an object within a target area that may be identified in both the images and the radar data. For example, the object may be any other type of ball, a club head on a golf club, a golf shaft, a baseball bat, a tennis racket, etc.
In another exemplary embodiment in which system 100 may be implemented, a trajectory of a golf ball that is being putted may be determined. The following relates to determining the trajectory of the golf ball, incorporating all of the above-described features in which data derived from images captured by a two-dimensional imager and data derived from a one-dimensional radar are merged and the analysis of images is reduced to a region of interest as defined by a radar track. Even though the example below shows tracking of a golf ball being putted, the exact same method can be used for any type of golf shot or any other moving object as long as it is detectable in both the radar data and the imager data.
The exemplary embodiments provide a device, system, and method to determine a three-dimensional trajectory of an object in which information from images captured by an imager and radar data generated by a radar are merged. The information is merged in such a way that each dimension of the trajectory is determined from a source that provides corresponding information that has a certainty above an acceptable threshold. Redundant information may also be utilized in corroborating a parameter determination. Through merging the information based on a time synchronization, the three-dimensional trajectory may be generated in a more robust, accurate, and versatile manner.
Those skilled in the art will understand that although the previously described embodiments describe a baseball pitch in detail and a golf ball putt more briefly, the above-described exemplary embodiments may be implemented to track the movement of any object in various trajectory types (i.e., free flying, bouncing or rolling). For example, the system 100 may also track a tennis ball at any part of a rally, including a serve. In another exemplary embodiment, the described system 100 may also be used to track a soccer ball, in particular higher velocity kicks at a goal. In a further exemplary embodiment, the described system 100 may be used to track a bowling ball as it slides and rolls down the bowling lane. Those skilled in the art will also understand that the system 100 is not limited to tracking the movement of spherical objects but may track the movement of any type of object. For example, the system 100 may track the movement of a baseball bat or a golf club when swinging at a ball. The system 100 may also be used to track parts of an athlete, such as the hand of a baseball player throwing a ball or the hand trajectory of a golfer. Furthermore, the system 100 may also be used to track an alpine skier down a ski slope, or a ski jumper both on the ramp and during the flight and landing.
Those skilled in the art will understand that the above-described exemplary embodiments may be implemented in any suitable software or hardware configuration or combination thereof. An exemplary hardware platform for implementing the exemplary embodiments may include, for example, an Intel x86 based platform with a compatible operating system, a Windows platform, a Mac platform with Mac OS, a mobile device having an operating system such as iOS, Android, etc. In a further example, the exemplary embodiments of the above-described method may be embodied as a program containing lines of code stored on a non-transitory computer readable storage medium that may be executed on a processor or microprocessor.
It will be apparent to those skilled in the art that various modifications may be made in the present disclosure, without departing from the spirit or the scope of the disclosure. Thus, it is intended that the present disclosure cover modifications and variations of this disclosure provided they come within the scope of the appended claims and their equivalents.