The present disclosure generally relates to the field of obstacle avoidance technology and more particularly, but not exclusively, to a system and a method for radar-based obstacle avoidance for unmanned aerial vehicles (UAVs).
Unmanned aerial vehicles (UAVs) have great potential to be widely used in both civil and military applications. Because of their low cost, safety benefits, and mobility, UAVs may potentially replace manned aerial vehicles in many tasks and perform well in applications for which traditional manned aerial vehicles are unsuited. However, because there is no on-board human control, UAV usage faces several challenges that need to be overcome, one of which is obstacle avoidance. UAVs need to avoid collision with both static and moving obstacles.
Existing obstacle avoidance technologies include ultrasonic obstacle avoidance, visual obstacle avoidance, and time-of-flight (TOF) obstacle avoidance, etc. One main drawback of ultrasonic obstacle avoidance is the substantially short detection distance, and some objects capable of absorbing ultrasound (such as carpet) may not be accurately detected. TOF obstacle avoidance has poor anti-interference capabilities, and its detection distance is often limited to about ten meters.
Visual obstacle avoidance, which requires the obstacles to have certain texture information for feature matching, is greatly affected by weather, and its detection distance is often limited to about twenty meters. Obstacles without sufficient texture information, e.g., linear obstacles or planar obstacles having indistinct surface texture, such as glass or wire mesh, may be difficult to detect. Moreover, when the relative speed between an obstacle target and the UAV is relatively large, the obstacle detection performance of visual obstacle avoidance may be significantly degraded.
The disclosed system and method for radar-based obstacle avoidance for UAVs are directed to solving one or more of the problems set forth above and other problems.
One aspect of the present disclosure provides a method for radar-based object avoidance for a movable platform. The method includes performing a plurality of “ping-pong” measurements to receive electromagnetic signals corresponding to an object and background clutters by a radar of the movable platform, and distinguishing the object from the background clutters. Each of the “ping-pong” measurements includes a first measurement and a second measurement. A first direction of the first measurement is different from a second direction of the second measurement.
Another aspect of the present disclosure provides a system for radar-based object avoidance for a movable platform. The system includes a radar and a radar data processing unit. The radar is configured to perform a plurality of “ping-pong” measurements to receive electromagnetic signals corresponding to an object and background clutters. The radar data processing unit is configured to distinguish the object from the background clutters. Each of the “ping-pong” measurements includes a first measurement and a second measurement. A first direction of the first measurement is different from a second direction of the second measurement.
Other aspects of the present disclosure can be understood by those skilled in the art in light of the description, the claims, and the drawings of the present disclosure.
The following drawings are merely examples for illustrative purposes according to various disclosed embodiments and are not intended to limit the scope of the present disclosure.
Reference will now be made in detail to exemplary embodiments of the disclosure and the accompanying drawings. Hereinafter, embodiments consistent with the disclosure will be described with reference to the drawings. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts. The described embodiments are some but not all of the embodiments of the present disclosure. Based on the disclosed embodiments, persons of ordinary skill in the art may derive other embodiments consistent with the present disclosure, all of which are within the scope of the present disclosure. Further, in the present disclosure, the disclosed embodiments and the features of the disclosed embodiments may be combined under conditions without conflicts.
The present disclosure provides a system and a method for radar-based object avoidance for a movable platform. The radar-based object avoidance system and method can be based on a radio frequency (RF) radar, such as a microwave radar, which is usually not affected by sunlight, smoke, fog, dust, or other factors that typically affect optical waves, and typically has improved directionality and range characteristics when compared with acoustic systems. The RF radar can detect an object or multiple objects, such as an obstacle or multiple obstacles, within a distance from about one meter to several hundred meters, including linear and planar objects, such as branches, cables, and barbed wires, etc., and can acquire various information of both static and moving objects, e.g., aircrafts in a multiple-aircraft formation flight.
The main body 102 constitutes a housing for accommodating various components of the movable platform 100, such as, for example, a control system (which may include the radar-based object avoidance system 104, as described below), one or more inertial measuring units (IMUs), one or more processors, one or more power sources, and/or other sensors.
The radar-based object avoidance system 104 includes a radar 104-2 and a radar data processing unit 104-4. The radar 104-2 can be directly mounted on the main body 102 of the movable platform 100. For example, the radar 104-2 can be mounted on the front, the back, the left, or the right of the main body 102. Further, the radar 104-2 can be mounted on any appropriate portion of the main body 102 through any appropriate mechanism, as long as such a mounting allows the radar 104-2 to efficiently transmit electromagnetic (EM) waves and receive reflected EM waves from object(s) in the path of the transmitted EM waves. In some embodiments, the radar 104-2 may be especially adapted for use in the movable platform 100. For example, the radar 104-2 may be power efficient, lightweight, and compact to avoid over-encumbering the movable platform 100.
Consistent with the disclosure, the radar 104-2 can include one or more transmitters producing the EM waves, e.g., in the radio frequency domain, one or more emitting antennas or antenna arrays to emit the EM waves (also referred to as the emitted signal), and one or more receiving antennas or antenna arrays (different from or the same as the emitting antennas or antenna arrays) to capture any returns from the object(s). According to the returns, the radar 104-2 can generate radar data and send the radar data to the radar data processing unit 104-4, which can, for example, process the radar data to determine properties or information of the object(s), also referred to as “object information,” as discussed in more detail below. For example, one or more data processors in the radar data processing unit 104-4 can be configured to execute a method consistent with the disclosure, such as one of the exemplary methods described below, to process the radar data.
The IMU 206 can include, for example, one or more accelerometers, one or more gyroscopes, and/or one or more magnetometers. The IMU 206 can detect acceleration information of the movable platform 100, such as, for example, a linear acceleration and/or changes in rotational attributes (such as pitch, roll, and yaw) of the movable platform 100. In some embodiments, the acceleration information may additionally or alternatively include a centripetal acceleration, which can be used to determine an angular velocity of the movable platform 100. The IMU 206 sends the acceleration information to the radar data processing unit 104-4 and/or the movement control unit 208. The movement control unit 208 sends movement control information to the radar data processing unit 104-4. The movement control information can include, for example, coordinates (latitude, longitude, and elevation) of the movable platform 100 in the geographic coordinate system, a velocity of the movable platform 100, and/or an attitude of the movable platform 100 (if the movable platform 100 is an aircraft). The radar data processing unit 104-4 can process the radar data, the acceleration information, and/or the movement control information to generate the object information and send the object information to the movement control unit 208 for controlling the movable platform 100 to avoid the object(s).
In some embodiments, as shown in
In some embodiments, the radar 104-2 includes a first emitting antenna or antenna array for emitting a first EM beam expanding in a range of angles in a first direction and a second emitting antenna or antenna array for emitting a second EM beam expanding in a range of angles in a second direction different from the first direction. The first and second directions, also referred to as first and second measurement directions or first and second scanning directions, are different from each other, and can, for example, be approximately perpendicular to each other. It is noted, however, that the terms “first” and “second” do not imply any order, such as the order in which the first and second EM beams are emitted. For example, the second beam can be emitted after the first beam is emitted, the first beam can be emitted after the second beam is emitted, or the first and second beams can be emitted simultaneously.
In some embodiments, the first direction includes a horizontal direction and the second direction includes a vertical direction. The horizontal direction and the vertical direction can be defined, for example, with respect to the ground. That is, the horizontal direction is parallel to the ground and the vertical direction is perpendicular to the ground. Alternatively, the horizontal direction and the vertical direction can be defined with respect to a plane on the movable platform, such as an upper surface of the movable platform. That is, the horizontal direction is parallel to the upper surface of the movable platform and the vertical direction is perpendicular to the upper surface of the movable platform.
For example, as shown in
In some embodiments, the radar 104-2 includes a first receiving antenna array for detecting part of the first EM beam 302 that is reflected, e.g., by an object 306, also referred to as “first reflected EM signals.” The radar 104-2 further includes a second receiving antenna array for detecting part of the second EM beam 304 that is reflected, e.g., by the object 306, also referred to as “second reflected EM signals.” In some embodiments, the first and second emitting antenna arrays can also serve as the first and second receiving antenna arrays, respectively. Hereinafter, the first receiving antenna array and the second receiving antenna array are referred to for the purposes of description. It is noted, however, that the terms “first receiving antenna array” and “second receiving antenna array” may also refer to the first emitting antenna array and the second emitting antenna array, respectively, when they also function as receiving antenna arrays.
Using the radar 104-2 described above, the movable platform 100 can detect, or measure, an object, e.g., an obstacle, such as the object 306 shown in
The first measurement will now be described in more detail. The second measurement is essentially similar to the first measurement but is performed in a different direction, and thus its detailed description is omitted.
According to the disclosure, the radar 104-2 emits the first EM beam 302, detects the first reflected EM signals, generates the first radar data, and sends the first radar data to the radar data processing unit 104-4 for processing. In some embodiments, the first reflected EM signals may include EM signals reflected by the object 306, which constitute the useful signal data, and EM signals unrelated to the object 306, i.e., background clutters. The background clutters can originate from various sources. For example, part of the background clutters may be caused by the radar-based object avoidance system 104 itself, e.g., the intrinsic noise of the radar-based object avoidance system 104. Further, part of the background clutters may result from the environment of the object 306. The first reflected EM signals need to be processed to distinguish the object 306 from the background clutters.
Various factors and approaches can be used to distinguish the object 306 from the background clutters. In some embodiments, the radar data processing unit 104-4 processes the first radar data to calculate a set of parameters associated with each of the first reflected EM signals. Each of the first reflected EM signals represents one potential candidate, which may be the object 306 or the background clutters. The parameters can include, for example, a candidate range, a candidate relative velocity, and a candidate signal strength. Consistent with the disclosure, one or more of the candidate range, the candidate relative velocity, or the candidate signal strength can be used to distinguish the object 306 from the background clutters, and the object direction angles can then be determined for the distinguished object 306. In some embodiments, direction angles can be determined for all of the potential candidates and then used to distinguish the object 306 from the background clutters. However, this approach may require a higher amount of computation capability and a longer computation time.
In some embodiments, the candidate range of a potential candidate can be calculated using
r = c·t/2 (1)
where r denotes the candidate range, c denotes the speed of light in vacuum (approximately 3.0×10⁸ m/s), and t denotes the period of time between the time when the first EM beam is transmitted by the radar 104-2 and the time when the first reflected EM signal is received by the radar 104-2. Further, the candidate relative velocity of the potential candidate can be calculated using, e.g., Doppler information from the relative movement between the potential candidate and the movable platform 100, such as the Doppler frequency shift. For example, a radial component of the candidate relative velocity, which is in the direction parallel to the line connecting the movable platform 100 and the potential candidate, can be calculated using
vr = λ·fD/2 (2)
where vr denotes the radial component of the candidate relative velocity, λ denotes a wavelength of the first EM beam, and fD denotes the Doppler frequency shift. The Doppler frequency shift fD can be positive or negative, depending on whether the potential candidate, such as the object 306, is moving toward or away from the movable platform 100. Correspondingly, the radial component of the candidate relative velocity can be positive or negative. Other components of the candidate relative velocity, such as an azimuth component vφ (in the horizontal direction) and an elevation component vθ (in the vertical direction) can be calculated according to angle estimation using one or more of, e.g., the radial component of the candidate relative velocity, the candidate range, and the candidate direction angles.
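The range and radial-velocity calculations of equations (1) and (2) can be sketched as follows (a minimal illustration; the function names are hypothetical, not from the disclosure):

```python
# Speed of light in vacuum, m/s (value used in the text).
C = 3.0e8

def candidate_range(round_trip_time_s):
    """Equation (1): r = c * t / 2, with t the round-trip time of the EM signal."""
    return C * round_trip_time_s / 2.0

def radial_velocity(wavelength_m, doppler_shift_hz):
    """Equation (2): v_r = lambda * f_D / 2.

    The sign of the Doppler shift f_D carries through, so the radial
    velocity is signed according to whether the candidate is moving
    toward or away from the movable platform.
    """
    return wavelength_m * doppler_shift_hz / 2.0
```

For example, a round-trip time of 1 µs corresponds to a candidate range of 150 m.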
The calculated candidate ranges and candidate relative velocities (such as the radial components) can be used to screen the potential candidates to distinguish the object 306 from the clutters. An exemplary method is discussed below with reference to
Sometimes, two or more potential candidates, such as two or more objects, are located at similar distances from the movable platform 100, and thus may not be distinguishable from each other based on the candidate ranges. In the disclosure, a range resolution of the radar 104-2 is defined as the minimum separation (in range) of two candidates that can be resolved as separate candidates, which can be calculated using c/2B, where B denotes a bandwidth of the first EM beam. Thus, to improve the range resolution, a broad-band radar can be used as the radar 104-2. For example, when the bandwidth of the first EM beam is about 1 GHz, the range resolution of the radar 104-2 is about 0.15 m. That is, if the range difference between two potential candidates, even if they are in different directions, is smaller than about 0.15 m, the radar-based object avoidance system 104 may not be able to distinguish the two potential candidates based on their ranges.
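The c/2B rule and the 1 GHz example above work out as in this short sketch (function names are illustrative):

```python
C = 3.0e8  # speed of light in vacuum, m/s

def range_resolution(bandwidth_hz):
    """Minimum resolvable range separation: c / (2 * B)."""
    return C / (2.0 * bandwidth_hz)

def resolvable_in_range(r1, r2, bandwidth_hz):
    """Whether two candidates can be separated on range alone."""
    return abs(r1 - r2) >= range_resolution(bandwidth_hz)
```

With a 1 GHz bandwidth, `range_resolution(1e9)` yields 0.15 m, matching the example in the text.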
Usually, the signal strength associated with the object 306 is higher than the signal strength associated with the background clutters. Therefore, alternatively or in addition to using the candidate ranges and the candidate relative velocities, the candidate signal strengths can be used to screen the potential candidates to distinguish the object 306 from the background clutters. In some embodiments, a constant-false-alarm-rate (CFAR) detection algorithm may be adopted. The role of the CFAR algorithm is to determine a strength threshold above which a first reflected EM signal can be considered to probably originate from an object. The threshold can be set based on experience or statistical results. A lower threshold may ensure that more objects can be detected, but the number of false alarms, i.e., reflected EM signals being incorrectly identified as originating from an object, may increase. On the other hand, a higher threshold may reduce the number of false alarms, but some object(s) may be missed. Consistent with the disclosure, the threshold can be set to achieve a required probability of false alarm (or equivalently, false alarm rate or time between false alarms).
In some embodiments, the background clutters against which the objects are to be detected are constant over time and space, and thus a fixed threshold may be chosen that provides a specified probability of false alarm, governed by the probability density function of the noise, which is usually assumed to be Gaussian. The probability of detection is then a function of the signal-to-noise ratio of the target return. In some embodiments, the noise level changes both spatially and temporally, such as when the movable platform 100 is moving. In these embodiments, a changing threshold may be used, where the threshold can be raised and lowered to maintain a constant probability of false alarm.
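A cell-averaging CFAR sketch illustrates the idea of a threshold that adapts to the surrounding noise level. The disclosure does not specify which CFAR variant is used; cell averaging is one common choice, and all names and parameter values here are illustrative:

```python
def ca_cfar(power, num_train=4, num_guard=2, scale=3.0):
    """Flag cells whose power exceeds `scale` times the mean power of the
    surrounding training cells. Guard cells around the cell under test
    are excluded from the noise estimate so the target's own energy does
    not inflate the threshold."""
    n = len(power)
    detections = []
    for i in range(n):
        train = []
        # Leading training cells (before the guard band).
        for j in range(i - num_guard - num_train, i - num_guard):
            if 0 <= j < n:
                train.append(power[j])
        # Trailing training cells (after the guard band).
        for j in range(i + num_guard + 1, i + num_guard + 1 + num_train):
            if 0 <= j < n:
                train.append(power[j])
        if train and power[i] > scale * (sum(train) / len(train)):
            detections.append(i)
    return detections
```

A strong return surrounded by uniform noise is flagged, while the uniform cells themselves stay below the adaptive threshold.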
After the object 306 is distinguished from the background clutters, the first object direction angle, e.g., the horizontal angle, for the distinguished object 306 can be calculated based on the first reflected EM signal associated with the object 306, e.g., using
θ = arcsin[(Phase1−Phase0)·λ/(2π·d)] (3)
where θ denotes the first object direction angle of the object, Phase0 denotes a phase of the first reflected EM signal corresponding to the object when received by Channel 0, Phase1 denotes a phase of the first reflected EM signal corresponding to the object when received by Channel 1, λ denotes the wavelength of the first EM beam, and d denotes a distance between Channel 0 and Channel 1.
In the example shown in
given that θ∈[−θmax1, θmax1], where x0, x1, . . . , and x(N−1) denote the phases of the first reflected EM signal corresponding to the object when received by Channel 0, Channel 1, . . . , and Channel (N−1), respectively. In this example, it is assumed that the channels are arranged at an equal interval d.
In some embodiments, the first receiving antenna array includes more than two channels but not all of the channels are used to calculate the first object direction angle of the object 306. In some embodiments, only two of the more than two channels are used in the calculation, and equation (3) can be used.
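An angle estimate from the phase difference between two receiving channels can be sketched as follows. A standard interferometric form, sin θ = (Phase1 − Phase0)·λ/(2πd), is assumed here, since this is the usual two-channel phase-comparison relation; the function name is illustrative:

```python
import math

def direction_angle(phase0, phase1, wavelength, d):
    """Two-channel phase-comparison direction estimate:
    sin(theta) = (phase1 - phase0) * wavelength / (2 * pi * d).
    The sine ratio is clamped to [-1, 1] to guard against noisy phases."""
    s = (phase1 - phase0) * wavelength / (2.0 * math.pi * d)
    return math.asin(max(-1.0, min(1.0, s)))
```

For a half-wavelength channel spacing (d = λ/2), a phase difference of π/2 corresponds to an angle of 30° (π/6 rad).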
According to the disclosure, the second measurement can be performed in a manner similar to that described above for the first measurement, except that all direction-specific parameters in the first measurement can be replaced with the direction-specific parameters in the second measurement. Within one “ping-pong” measurement, the first and second measurements can be performed in any order, i.e., the first measurement can be performed before or after the second measurement, or the first and second measurement can be performed approximately simultaneously.
In some embodiments, a non-direction-specific parameter, such as the candidate range, the candidate relative velocity, or the candidate signal strength, can be calculated in both the first and second measurements, and an average of the calculation results from the two measurements can be used as the final value of the parameter. Further, as discussed above, the period of time for the movable platform 100 to perform the first and second measurements is usually relatively short, such as several milliseconds. Therefore, in some embodiments, either one of the calculation results from the two measurements can be used as the final value of the parameter.
Consistent with the disclosure, the object information obtained from one “ping-pong” measurement, including one first measurement and one second measurement, forms one “ping-pong” measurement frame, also referred to as a “measurement frame.” Each measurement frame contains the object information of one or more objects. By performing a plurality of “ping-pong” measurements, the radar data processing unit 104-4 can obtain a plurality of measurement frames. In some embodiments, to track and predict the one or more objects, the object information of each object in two measurement frames, such as two adjacent measurement frames, can be matched and correlated to establish a corresponding relationship of each object between the two measurement frames. When the measurement frames contain object information of multiple objects, the multiple objects can be numbered according to the corresponding relationships.
For example, assume n1 objects are detected in a current measurement frame and n2 objects are detected in a previous measurement frame, denoted by Tq, q∈[1, n2]. Matching and correlating the object information of the objects in the two measurement frames then includes calculating matching probabilities, also referred to as “threshold-association probabilities,” of the objects in the current measurement frame being the objects in the previous measurement frame. In some embodiments, for an object Tq in the previous measurement frame, a threshold area, also referred to as a “gate,” can be determined according to a range threshold between the movable platform 100 and the object Tq, a relative velocity threshold between the movable platform 100 and the object Tq, direction angle zone thresholds (including a first direction angle zone threshold and a second direction angle zone threshold) between the movable platform 100 and the object Tq, and statistical data of the n2 objects. In some embodiments, the threshold area can be determined based on a detection probability (described later) according to the range threshold, the relative velocity threshold, and the direction angle zone thresholds, and may vary slightly according to results from test experiments. The statistical data can include, for example, an average value, a standard deviation, and/or a Mahalanobis distance of each of the object ranges, the object relative velocities, the first object direction angles, and the second object direction angles of the n2 objects. In some embodiments, the gate can be centered at the object Tq, i.e., the threshold area can be an area surrounding the object Tq.
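As one illustrative piece of the statistical data mentioned above, a simplified Mahalanobis distance can be computed as follows. This sketch assumes a diagonal covariance (i.e., the parameters are treated as independent), in which case the squared Mahalanobis distance reduces to a sum of squared z-scores; the function name is hypothetical:

```python
def squared_mahalanobis(x, mean, std):
    """Squared Mahalanobis distance under a diagonal covariance: the sum
    of squared z-scores of each parameter (e.g., range, relative
    velocity, and the two direction angles)."""
    return sum(((xi - mi) / si) ** 2 for xi, mi, si in zip(x, mean, std))
```

The full Mahalanobis distance with a non-diagonal covariance matrix would additionally account for correlations between the parameters.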
After the threshold area, i.e., the gate, is determined, the number, L, of objects in the current measurement frame that fall in the gate is determined. L=0 means no object in the current measurement frame matches the object Tq in the previous measurement frame, i.e., the matching probabilities for Tq are 0. L=1 means one object in the current measurement frame falls in the gate and that object can be considered as perfectly matching the object Tq, i.e., the matching probability of Tq in the previous measurement frame and that object in the current measurement frame is 100%. This means that the object Tq and that object are the same object, also referred to as a “matched object.” Further, if L>1, then multiple objects, denoted by Mp, p∈[1, L], in the current measurement frame may possibly match the object Tq in the previous measurement frame, and thus the probability that each of these multiple objects matches the object Tq can be calculated to determine which one of them most likely matches the object Tq. In some embodiments, it is assumed that the distribution of the objects in the current measurement frame follows a Gaussian distribution. The probability that an object Mp falling in the gate matches the object Tq, i.e., P(Tq|Mp), can be calculated using the following equation:
where γ denotes the density of the background clutters, V denotes the volume of the gate, S denotes the variance of the L objects in the current measurement frame that fall in the gate, which can be, for example, a sum of a range variance, a velocity variance, and an angle variance of the L objects, PD denotes the detection probability, and PG denotes the probability that the object Mp in the current measurement frame correctly falls in the gate. The detection probability PD can usually be set to 1 (one), assuming that target(s) in the gate will be tracked. In some embodiments, the detection probability PD can be set to a value close to but smaller than 1 (one), because sometimes the target(s) cannot be tracked for reasons such as device breakdown. After the matching probabilities are calculated, the object Mp that has the highest matching probability can be determined as the object in the current measurement frame that matches the object Tq in the previous measurement frame, i.e., they are the same object (matched object). In some embodiments, the object information of the matched object can be smoothed, e.g., using a filter, to further reduce the noise and improve the signal-to-noise ratio. When multiple objects are matched, the matched objects can be numbered.
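Since the exact threshold-association probability equation is not reproduced in this text, the following is only a simplified gate-and-score association sketch under a Gaussian assumption: candidates outside the gate are discarded (the L = 0 case returns no match), and the in-gate candidate with the highest Gaussian score is taken as the matched object. All names and the scoring function are illustrative:

```python
import math

def _distance(a, b):
    """Euclidean distance in the chosen parameter space
    (e.g., range, relative velocity, direction angles)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def associate(prev_obj, candidates, gate_radius):
    """Return the index of the in-gate candidate that best matches
    `prev_obj`, or None when no candidate falls in the gate (L == 0).
    A Gaussian score stands in for the matching probability."""
    in_gate = [(i, c) for i, c in enumerate(candidates)
               if _distance(prev_obj, c) <= gate_radius]
    if not in_gate:
        return None
    best_i, _ = max(
        in_gate,
        key=lambda ic: math.exp(-0.5 * _distance(prev_obj, ic[1]) ** 2))
    return best_i
```

With one candidate in the gate this reduces to the L = 1 perfect-match case; with several, the closest candidate (highest score) wins.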
In some embodiments, the radar data processing unit 104-4 can obtain movement control information such as, for example, coordinates (latitude, longitude, and elevation) of the movable platform 100 in the geographic coordinate system, a velocity of the movable platform 100, and/or acceleration information of the movable platform 100. In some embodiments, the movable platform 100 is an aircraft, such as a UAV, and the movement control information can further include an attitude of the movable platform 100. In some embodiments, the radar data processing unit 104-4 can obtain the acceleration information from the IMU 206 and other movement control information from the movement control unit 208. In some embodiments, the radar data processing unit 104-4 can obtain the movement control information from the movement control unit 208. Based on the object information of the matched object and the movement control information, the radar data processing unit 104-4 can track the matched object and predict future object information of the matched object.
Specifically, based on the predicted object information, the radar data processing unit 104-4 can determine a real-time motion model of the movable platform 100, which may include at least one of a uniform motion model corresponding to a zero acceleration, a uniformly accelerated motion model corresponding to a uniform acceleration, or a nonuniformly accelerated motion model corresponding to a nonuniform acceleration. The different motion models can be pre-built and the radar data processing unit 104-4 can choose one or more appropriate models for the purpose of tracking the matched object. Then, based on the real-time motion model of the movable platform 100, the radar data processing unit 104-4 can apply a predetermined filtering algorithm to the object information of the matched object, to predict future object information of the matched object. The predetermined filtering algorithm may include but is not limited to the Kalman filtering algorithm or the particle filtering algorithm.
The Kalman filtering algorithm has been widely adopted to track and estimate the state of a system and the variance or uncertainty of the estimate. The estimate is updated using a state transition model and measurements. In some embodiments, the state transition model may be determined in real time according to the real-time motion model of the movable platform 100. Thus, the prediction accuracy of the future object information of the matched object may be improved.
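A minimal one-dimensional constant-velocity Kalman filter illustrates the predict-update cycle described above. The state transition model here corresponds to the uniform motion model; all parameter values are illustrative, and a real tracker would use the full 3-D state and a transition model chosen per the real-time motion model:

```python
def kalman_1d(z_list, dt=0.005, q=1e-3, r=0.5):
    """1-D constant-velocity Kalman filter sketch.
    State is [position, velocity]; measurements are position (range) only.
    Returns the final state estimate."""
    x = [z_list[0], 0.0]                 # state estimate
    P = [[1.0, 0.0], [0.0, 1.0]]         # estimate covariance
    for z in z_list[1:]:
        # Predict with transition F = [[1, dt], [0, 1]] (uniform motion).
        x = [x[0] + dt * x[1], x[1]]
        P = [[P[0][0] + dt * (P[1][0] + P[0][1]) + dt * dt * P[1][1] + q,
              P[0][1] + dt * P[1][1]],
             [P[1][0] + dt * P[1][1], P[1][1] + q]]
        # Update with position measurement (H = [1, 0]).
        s = P[0][0] + r                  # innovation variance
        k = [P[0][0] / s, P[1][0] / s]   # Kalman gain
        y = z - x[0]                     # innovation
        x = [x[0] + k[0] * y, x[1] + k[1] * y]
        P = [[(1 - k[0]) * P[0][0], (1 - k[0]) * P[0][1]],
             [P[1][0] - k[1] * P[0][0], P[1][1] - k[1] * P[0][1]]]
    return x
```

Fed a constant range measurement, the filter settles on that range with zero estimated velocity.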
Based on the predicted future object information of the matched object, the movement control unit 208 can calculate a movement plan for the movable platform 100 to avoid the matched object. In some embodiments, based on the predicted future object information of the matched object, the radar data processing unit 104-4 can obtain position information and relative velocity information of the matched object in a spherical coordinate system, in which the movable platform 100 is the origin. The position information and the relative velocity information of the matched object in the spherical coordinate system can be expressed as (r, θ, φ) and (vr, vθ, vφ), respectively, where r denotes the radial distance, θ denotes the polar angle, and φ denotes the azimuth angle.
The radar data processing unit 104-4 then converts the position information and relative velocity information of the matched object from the spherical coordinate system to a Cartesian coordinate system, in which the movable platform 100 is the origin, based on a conversion relationship between the spherical coordinate system and the Cartesian coordinate system. The position information and the relative velocity information of the matched object in the Cartesian coordinate system may be expressed as (x, y, z) and (vx, vy, vz), respectively, where vx denotes the component of the relative velocity in the x-direction, vy denotes the component of the relative velocity in the y-direction, and vz denotes the component of the relative velocity in the z-direction.
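The position conversion can be sketched as follows. The physics convention, with the polar angle θ measured from the +z axis, is assumed here, since the disclosure does not fix a specific convention; converting the velocity components would additionally use the Jacobian of this mapping:

```python
import math

def spherical_to_cartesian(r, theta, phi):
    """Convert (radial distance, polar angle, azimuth angle) to (x, y, z),
    with theta measured from the +z axis and phi in the x-y plane."""
    x = r * math.sin(theta) * math.cos(phi)
    y = r * math.sin(theta) * math.sin(phi)
    z = r * math.cos(theta)
    return x, y, z
```

For instance, a point at radial distance 1 with θ = π/2 and φ = 0 lies on the +x axis.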
The position information and relative velocity information of an object in the Cartesian coordinate system are also collectively referred to as three-dimensional (3D) depth information of the object. Based on the predicted future object information of the matched object, the 3D depth information of the matched object in front of the movable platform 100 can be obtained in real time. Given that the motion mode of the movable platform 100 remains the same, the radar data processing unit 104-4 can calculate a time when the movable platform 100 and the matched object will collide, based on the position information and relative velocity information of the matched object in the Cartesian coordinate system.
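Under the stated assumption that the relative motion stays the same, a closing-geometry time estimate can be sketched as follows (the disclosure does not give its collision-time formula; this uses the standard closest-approach time for constant relative velocity, and the names are illustrative):

```python
def time_of_closest_approach(pos, vel):
    """With relative position `pos` and constant relative velocity `vel`
    in Cartesian coordinates, the separation |pos + t*vel| is minimized
    at t* = -(pos . vel) / |vel|^2. Returns None when the object is not
    closing (no future approach)."""
    dot = sum(p * v for p, v in zip(pos, vel))
    vv = sum(v * v for v in vel)
    if vv == 0.0 or dot >= 0.0:
        return None
    return -dot / vv
```

For an object 10 m ahead closing at 2 m/s, the closest approach (here, a head-on collision) occurs after 5 s.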
Based on the time when the movable platform 100 and the matched object will collide, the position information and relative velocity information of the matched object in the Cartesian coordinate system, and the movement control information, the movement control unit 208 can calculate the movement plan to avoid the matched object, and operate the movable platform 100 according to the movement plan.
In some embodiments, the movement plan can include adding a superimposition velocity onto a current velocity of the movable platform 100, i.e., superimposing a maneuvering velocity onto the current velocity of the movable platform 100.
As discussed above, a movable platform consistent with the disclosure can include a UAV.
The fuselage 702 constitutes a housing for accommodating various components of the UAV 700, such as, for example, a control system (which may include the radar-based object avoidance system 704), one or more inertial measuring units (IMUs), one or more processors, one or more power sources, and/or other sensors. The rotors 706 can be connected to the fuselage 702 via one or more arms or extensions that can branch from edges or a central portion of the fuselage 702, and can be mounted at or near the ends of the arms. The rotors 706 are configured to generate lift for the UAV 700, and serve as propulsion units that can enable the UAV 700 to move about freely in the air.
The radar-based object avoidance system 704 is similar to the radar-based object avoidance system 104 shown in
In some embodiments, as shown in
In some embodiments, as shown in
The UAV 700 shown in
The present disclosure also provides a method for radar-based object avoidance for a movable platform.
As shown in
At 804, the received EM signals are filtered to obtain EM signals corresponding to the object. In some embodiments, the received EM signals can be filtered according to at least one of range information calculated based on the EM signals, relative velocity information calculated based on the EM signals, or a CFAR detection algorithm.
At 806, object information of the object is obtained for each of the “ping-pong” measurements. In some embodiments, the object information may include an object range, an object relative velocity, and object direction angles, which forms one “ping-pong” measurement frame, as discussed above.
At 808, the object information in two “ping-pong” measurement frames are matched and correlated to establish a corresponding relationship for the object between the two “ping-pong” measurement frames. In the embodiments where multiple objects are identified in each of the two “ping-pong” measurement frames, the objects are numbered.
At 904, a distance threshold between the movable platform and the object, a relative velocity threshold between the platform and the object, and direction angle zone thresholds of the object with respect to the movable platform are determined for the previous “ping-pong” measurement frame.
At 906, based on the distance threshold, the relative velocity threshold, the direction angle zone thresholds, and statistic data of the object, a threshold area of the object is determined.
At 908, a matching probability of the object is calculated.
At 910, the object in the previous “ping-pong” measurement frame and the object in the current “ping-pong” measurement frame are matched according to the matching probability. That is, the object in the previous “ping-pong” measurement frame and the object in the current “ping-pong” measurement frame are the same object, which is also referred to as a “matched object.”
Referring again to
At 812, future object information of the matched object is predicted based on the movement control information and the object information of the matched object.
Referring again to
The description of the disclosed embodiments is provided to illustrate, rather than limiting, the present disclosure. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the disclosure. Thus, the present disclosure is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
This application is a continuation of International Application No. PCT/CN2017/072451, filed on Jan. 24, 2017, the entire content of which is incorporated herein by reference.
Parent application: PCT/CN2017/072451, filed Jan. 2017 (US). Child application: U.S. application Ser. No. 16518655.