This application relates to the field of communications technologies, and in particular, to a method and apparatus for identifying behavior of a target, and a radar system.
As a new technology emerging in recent years, human behavior identification has attracted extensive attention in fields such as video surveillance, automated driving, intelligent human-computer interaction, and intelligent traffic early warning.
Commonly used human behavior identification technologies are implemented based on image acquisition and image processing. That is, images of pedestrians are captured by an image capturing device, target identification is performed on the captured images to obtain a target image of each pedestrian, and feature extraction is performed on each target image based on a feature extraction algorithm to obtain feature data corresponding to the pedestrian. Finally, the obtained feature data corresponding to each pedestrian is input into a pre-trained behavior identification model, and an output of the behavior identification model is behavior information of the pedestrian, such as running or walking.
The foregoing human behavior identification technology based on image acquisition and image processing may be easily affected by light rays, the line of sight of the image capturing device, and the like, resulting in low identification accuracy.
To resolve the problem of low accuracy of behavior identification, embodiments of this application provide a method and apparatus for identifying behavior of a target, and a radar system. The technical solutions are as follows.
According to a first aspect, a method for identifying behavior of a target is provided, where the method includes receiving a radar echo signal reflected by a target, processing the radar echo signal to obtain time-frequency domain data, processing the time-frequency domain data to obtain signal attribute feature data and linear prediction coefficient (LPC) feature data, where the signal attribute feature data is used to represent a feature of the radar echo signal attribute, and the LPC feature data is used to represent a feature of the radar echo signal, and inputting the signal attribute feature data and the LPC feature data into a behavior identification model, and outputting behavior information of the target.
According to the solution in this embodiment of this application, a transmit antenna of a radar system continuously transmits radar signals based on a pulse repetition period, that is, the transmit antenna of the radar system continuously transmits modulated radar signals at different frequencies in each pulse repetition period, where the pulse repetition period may also be referred to as a frequency sweep period. After a radar signal is reflected by the target, a receive antenna receives the radar signal, where the received radar signal is a radar echo signal. When processing radar echo signals, the radar system may process all radar echo signals that are received in a predetermined behavior identification period. One behavior identification period includes a predetermined quantity of pulse repetition periods.
After receiving the radar echo signal reflected by the target, the radar system mixes the received radar echo signal with the radar signal transmitted when the radar echo signal is received, to obtain a beat signal. Then, N-point sampling is performed on an analog signal of the beat signal, and the N-point sampled analog signal is converted into a digital signal using an analog-to-digital (A/D) converter. For each pulse repetition period, M-point fast Fourier transformation (FFT) may be performed on the beat signal after A/D conversion, that is, frequency domain data of M points can be obtained. The frequency domain data obtained for each point may be expressed in a form of a complex number. For the obtained frequency domain data of the M points, if an amplitude of a signal in a spectrum diagram corresponding to the frequency domain data is greater than a predetermined value, it is considered that the frequency domain data of the point is obtained from the radar echo signal reflected by the target. Then, the frequency domain data of the point may be used as the frequency domain data of the target. The frequency domain data in one behavior identification period is accumulated to obtain the frequency domain data of the target in one behavior identification period, where the frequency domain data of the target in one behavior identification period is a one-dimensional vector. Then, a short time Fourier transform (STFT) may be performed on the frequency domain data of the target in one behavior identification period to obtain time-frequency domain data of the target in one behavior identification period, where the time-frequency domain data of the target in one behavior identification period is a two-dimensional matrix.
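For illustration only, the following sketch (assuming the NumPy library; the function name, array layout, and threshold are hypothetical) shows one possible way to perform the per-period FFT, select the point whose amplitude exceeds the predetermined value, and accumulate the frequency domain data of the target over one behavior identification period. It is a simplified single-target sketch, not the exact implementation of this embodiment.

```python
import numpy as np

def accumulate_target_frequency_data(beat_samples, threshold):
    """Simplified single-target sketch: per-period M-point FFT and amplitude
    thresholding, accumulating one frequency-domain point per pulse repetition
    period over a behavior identification period.

    beat_samples: complex array of shape (num_periods, M) holding the
    A/D-converted beat signal of each pulse repetition period (assumed layout).
    """
    target_points = []
    for period in beat_samples:
        spectrum = np.fft.fft(period)            # M-point FFT of one period
        peak = int(np.argmax(np.abs(spectrum)))  # strongest reflection in the spectrum
        if np.abs(spectrum[peak]) > threshold:   # treated as the echo of the target
            target_points.append(spectrum[peak])
    return np.asarray(target_points)             # frequency domain data of the target
```

The STFT described above can then be applied to the returned vector to obtain the time-frequency domain data of the target in one behavior identification period.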
Then, the signal attribute feature data and the LPC feature data of the target are calculated based on the time-frequency domain data of the target in one behavior identification period, where the signal attribute feature data is used to represent a feature of the radar echo signal attribute, and the LPC feature data is used to represent a feature of the radar echo signal.
Finally, the obtained signal attribute feature data and LPC feature data of the target are inputted into the behavior identification model to obtain the behavior information of the target. The behavior information may be running, walking, jumping, or the like. The behavior identification model can be a machine learning model obtained through training based on a large quantity of training samples.
In this embodiment of this application, radar is used to detect a target. Because an electromagnetic wave transmitted by the radar is less affected by factors such as light and weather, the radar detects the target more accurately, such that the feature data of the target that is obtained through analysis based on the radar echo signal is more accurate, and therefore the finally determined behavior information of the target is more accurate.
In a possible implementation, processing the time-frequency domain data to obtain signal attribute feature data and LPC feature data includes processing the time-frequency domain data to obtain the signal attribute feature data, and inputting the time-frequency domain data into an LPC function to obtain the LPC feature data.
According to the solution shown in this embodiment of this application, the radar system can process the time-frequency domain data to obtain the signal attribute feature data. Then, the LPC feature data can be obtained using the LPC function. The LPC function may be as follows: x̂(n)=a1x(n−1)+a2x(n−2)+ . . . +aPx(n−P), where x(n) is the nth element of the data input into the LPC function, x̂(n) is a predicted value of x(n), P is a prediction order, and a1 to aP are the LPCs.
In a possible implementation, before processing the time-frequency domain data to obtain signal attribute feature data and LPC feature data, the method further includes performing dimension reduction on the time-frequency domain data.
According to the solution shown in this embodiment of this application, dimension reduction may be performed on the time-frequency domain data of the target, and then the feature data can be calculated based on the dimension-reduced time-frequency domain data. In this way, the amount of time-frequency domain data can be reduced, the calculation of the feature data can be accelerated, and the efficiency of identifying behavior of a target can be improved.
In a possible implementation, performing dimension reduction on the time-frequency domain data of each target includes performing dimension reduction on the time-frequency domain data of each target based on a principal component analysis (PCA) algorithm.
In the solution shown in this embodiment of this application, dimension reduction is performed on the time-frequency domain data based on the PCA algorithm.
In a possible implementation, the signal attribute feature data includes one or more of a maximum frequency value, a mean value of amplitude, a standard deviation of amplitude value, a mean absolute error of amplitude value, an amplitude value quartile, an amplitude value interquartile range, a spectral entropy, amplitude value skewness, and amplitude value kurtosis.
In the solution shown in this embodiment of this application, the maximum frequency value is a maximum Doppler frequency value in a time-frequency spectrogram corresponding to the time-frequency domain data. The mean value of amplitude is an average value of all amplitude values corresponding to the time-frequency domain data. The standard deviation of amplitude value is a standard deviation of all the amplitude values corresponding to the time-frequency domain data. The mean absolute error of amplitude value is a mean absolute error of all the amplitude values corresponding to the time-frequency domain data. The amplitude value quartile means that the amplitude values corresponding to the time-frequency domain data are arranged from small to large and divided into four equal parts, and the amplitude values at the three points of division are respectively referred to as a 25% quartile, a 50% quartile, and a 75% quartile. The amplitude value interquartile range refers to a difference between the 75% quartile and the 25% quartile. The amplitude value skewness is a measure of a direction and degree of skewness of amplitude value distribution, and is a digital feature of a degree of asymmetry of the amplitude value distribution. The amplitude value kurtosis is a digital feature reflecting the peakedness of the probability density distribution curve of the amplitude values at the position of its mean value. The spectral entropy represents a relationship between a power spectrum corresponding to the time-frequency domain data and an entropy rate.
In a possible implementation, the behavior identification model is a support vector machine (SVM) classifier model.
In the solution shown in this embodiment of this application, the behavior identification model may be an SVM classification model. During classification of behavior information, the SVM classification model can be used to classify the behavior information into only two types. To finally obtain a combination of a plurality of types of behavior information of a target, a plurality of SVM classification models can be combined for use. That is, a first SVM can classify the behavior information of the target into two types first, and a second SVM can classify the two types of behavior information to obtain subtypes corresponding to each type, and so on in order to obtain a combination of a plurality of types of behavior information. It can be learned that a combination of a plurality of types of behavior information of a target can be obtained by combining a plurality of SVM classification models, that is, the obtained behavior information of the target is more comprehensive.
According to a second aspect, an apparatus for identifying behavior of a target is provided, where the apparatus includes a receiving module configured to receive a radar echo signal reflected by a target, a processing module configured to process the radar echo signal to obtain time-frequency domain data, and process the time-frequency domain data to obtain signal attribute feature data and LPC feature data, where the signal attribute feature data is used to represent a feature of the radar echo signal attribute, and the LPC feature data is used to represent a feature of the radar echo signal, and an identification module configured to input the signal attribute feature data and the LPC feature data into a behavior identification model, and output behavior information of the target.
In a possible implementation, the processing module is configured to process the time-frequency domain data to obtain the signal attribute feature data, and input the time-frequency domain data into an LPC function to obtain the LPC feature data.
In a possible implementation, the processing module is further configured to perform dimension reduction on the time-frequency domain data.
In a possible implementation, the processing module is configured to perform dimension reduction on the time-frequency domain data of each target based on a PCA algorithm.
In a possible implementation, the signal attribute feature data includes one or more of a maximum frequency value, a mean value of amplitude, a standard deviation of amplitude value, a mean absolute error of amplitude value, an amplitude value quartile, an amplitude value interquartile range, a spectral entropy, amplitude value skewness, and amplitude value kurtosis.
In a possible implementation, the behavior identification model is an SVM classifier model.
According to a third aspect, a radar system is provided, where the radar system includes a signal transmitting apparatus, a signal receiving apparatus, and a signal processing apparatus. The signal transmitting apparatus is configured to transmit a radar signal. The signal receiving apparatus is configured to receive a radar echo signal reflected back by a target when the radar signal contacts the target. The signal processing apparatus is configured to process the radar echo signal received by the signal receiving apparatus, to obtain time-frequency domain data, process the time-frequency domain data to obtain signal attribute feature data and LPC feature data, where the signal attribute feature data is used to represent a feature of the radar echo signal attribute, and the LPC feature data is used to represent a feature of the radar echo signal, and input the signal attribute feature data and the LPC feature data into a behavior identification model, and output behavior information of the target.
According to a fourth aspect, a computer-readable storage medium is provided, where the computer-readable storage medium includes an instruction, and when the instruction runs on a computer, the computer is enabled to perform the method for identifying behavior of a target according to the first aspect.
The technical solutions provided in the embodiments of this application bring the following beneficial effects.
The radar echo signal reflected by the target is received, the radar echo signal is processed to obtain the signal attribute feature data and the LPC feature data of the target, and then the signal attribute feature data and the LPC feature data of the target are input into the behavior identification model to obtain the behavior information of the target. Because the radar signal transmitted by the radar is less affected by light, weather, and the like, a plurality of types of feature data of the target that are obtained based on the radar echo signal are more accurate, and further, the finally obtained behavior information is more accurate.
An embodiment of this application provides a method for identifying behavior of a target. The method can be implemented by a radar system and is applied in scenarios such as automated driving, intelligent human-computer interaction, and intelligent traffic early warning. Using automated driving as an example, the radar system may be a vehicle-mounted radar system, and the radar system can identify behavior of a pedestrian in front of a vehicle to determine whether there is a danger, and whether an emergency braking or deceleration process should be performed. For example, when a pedestrian crosses a guardrail in front of the vehicle and approaches the vehicle, the vehicle may immediately perform the emergency braking process to avoid collision.
The foregoing radar system may be a frequency-modulated continuous-wave (FMCW) radar system, and the radar signal transmitted by the FMCW radar system in this embodiment of this application may be a sawtooth wave, a triangular wave, a trapezoidal wave, or the like. The radar system transmits a radar signal to the outside. After contacting a target, the radar signal is reflected back by the target, and is received by the radar system. Generally, the radar signal reflected by the target may be referred to as a radar echo signal. After receiving the radar echo signal, the radar system may analyze the radar echo signal to extract feature data of the radar echo signal, and then determine current behavior information of the target, such as running, walking, or crossing, based on the feature data.
The FMCW radar system may include a signal transmitting apparatus, a signal receiving apparatus, and a signal processing apparatus. The signal transmitting apparatus may include a transmit antenna 110 and an FMCW waveform generation unit 160. The signal receiving apparatus may include a receive antenna 120. The signal processing apparatus may include a mixer 130, a low-pass filter 140, an A/D converter 150, and a signal processing unit 170. The transmit antenna 110 is configured to transmit a radar signal. The receive antenna 120 is configured to receive a radar echo signal. The mixer 130 is configured to mix the received radar echo signal and the transmitted radar signal to obtain a beat signal, where the beat signal may also be referred to as a differential frequency signal or an intermediate frequency signal. The low-pass filter 140 is configured to filter out unwanted high-frequency signals from the mixed beat signals. The A/D converter 150 is configured to convert an analog signal of an electromagnetic wave into a digital signal for subsequent processing. The FMCW waveform generation unit 160 is configured to generate a to-be-transmitted radar signal, and the FMCW waveform generation unit 160 may include an FMCW waveform generator and an oscillator. The signal processing unit 170 may include a processor and a memory, where the processor is configured to perform feature extraction on the beat signal to obtain feature data of a target, and obtain behavior information of the target based on the feature data, and the memory is configured to store intermediate data, result data, and the like generated during processing of the beat signal for subsequent viewing.
An embodiment of this application provides a method for identifying behavior of a target. As shown in
Step 201: Receive a radar echo signal reflected by a target.
During implementation, a radar system may transmit radar signals based on a pulse repetition period, that is, a transmit antenna of the radar system continuously transmits modulated radar signals at different frequencies in each pulse repetition period, where the pulse repetition period may also be referred to as a frequency sweep period. The frequency variation pattern of the transmitted radar signal may also vary according to a requirement.
Step 202: Process the radar echo signal to obtain time-frequency domain data.
During implementation, the radar system may use a mixer to mix the received radar echo signal and the radar signal being transmitted when the radar echo signal is received, to obtain a beat signal corresponding to the radar echo signal.
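As a purely numerical illustration of this mixing (dechirping) step, and not part of the embodiment itself, the sketch below simulates a sawtooth FMCW baseband chirp and its delayed echo with NumPy; all parameter values (sample rate, sweep bandwidth, target range) are hypothetical.

```python
import numpy as np

# Simplified numerical illustration of dechirping (mixing) for a sawtooth FMCW
# baseband signal. All parameter values are hypothetical and for demonstration only.
fs, T, B = 2e6, 1e-3, 150e6              # sample rate, sweep period, sweep bandwidth
slope = B / T                             # frequency slope of the chirp
t = np.arange(0, T, 1 / fs)
tau = 2 * 30.0 / 3e8                      # round-trip delay of a target at 30 m

tx = np.exp(1j * np.pi * slope * t ** 2)            # transmitted chirp (baseband phase)
rx = np.exp(1j * np.pi * slope * (t - tau) ** 2)    # received echo: a delayed copy
beat = tx * np.conj(rx)                             # mixer output (beat signal)

freqs = np.fft.fftfreq(len(t), 1 / fs)
print(freqs[np.argmax(np.abs(np.fft.fft(beat)))])   # ≈ slope * tau ≈ 3.0e4 Hz
```

The frequency of the resulting beat signal is proportional to the round-trip delay, which is what the subsequent FFT resolves into the frequency domain data of the target.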
In a scenario in which behavior of a single target needs to be identified, for each pulse repetition period, M-point FFT may be performed on the beat signal after the A/D conversion, that is, frequency domain data of M points may be obtained, and the frequency domain data obtained at each point may be expressed in a form of a complex number, for example, a+bi. For the obtained frequency domain data of the M points, if an amplitude of a signal in a spectrum diagram corresponding to the frequency domain data is greater than a predetermined value, it is considered that the frequency domain data of the point is obtained from the radar echo signal reflected by the target. Then, the frequency domain data of the point may be used as the frequency domain data of the target.
In addition, beat signals corresponding to A pulse repetition periods may be used as a beat signal of a frame, and beat signals of B frames may be used as a beat signal of a behavior identification period. Then, for the frequency domain data of the target in one behavior identification period, the frequency domain data of the target obtained from each pulse repetition period in the behavior identification period can be accumulated to obtain the frequency domain data Si of the target in the behavior identification period, where Si is a one-dimensional vector of 1×AB, and each element in the vector represents a complex number of the frequency domain data of a point.
Then, time-frequency domain analysis may be performed on the frequency domain data Si of the target in one behavior identification period to obtain corresponding time-frequency domain data. The time-frequency domain analysis may be performing an STFT on the frequency domain data of the target in one behavior identification period, that is, a predetermined window function and the frequency domain data are multiplied, and then a W-point FFT is performed on the windowed frequency domain data such that the time-frequency domain data Qi corresponding to the frequency domain data of the target in one behavior identification period can be obtained. Because a W-point FFT can be performed C times on the frequency domain data with a size of AB, Qi is a two-dimensional matrix of W×C, and each element in the matrix represents a complex number of time-frequency domain data of a point. The value of C is determined by a length of a sliding window, and C≤AB/W.
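A minimal sketch of this STFT step is given below, assuming NumPy, a Hann window, and non-overlapping segments (so that C=AB/W). The window choice and the absence of overlap are assumptions, not requirements of the embodiment.

```python
import numpy as np

def time_frequency_data(Si, W):
    """Illustrative STFT of the target's frequency domain data Si (length A*B).

    A predetermined window (a Hann window is assumed here) is multiplied with
    successive non-overlapping segments of Si, and a W-point FFT is applied to
    each segment, giving a W x C matrix Qi with C = len(Si) // W.
    """
    Si = np.asarray(Si)
    C = len(Si) // W                       # number of W-point FFTs that fit
    window = np.hanning(W)                 # predetermined window function (assumption)
    Qi = np.empty((W, C), dtype=complex)
    for c in range(C):
        segment = Si[c * W:(c + 1) * W] * window
        Qi[:, c] = np.fft.fft(segment, n=W)   # W-point FFT of the windowed segment
    return Qi                               # two-dimensional time-frequency matrix
```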
In a scenario in which behavior of a plurality of targets needs to be identified, for each pulse repetition period, M-point FFT is performed on the beat signal after A/D conversion to obtain M-point frequency domain data, and a plurality of signal amplitudes in the spectrum diagram corresponding to the M-point frequency domain data may be greater than a predetermined value, such that the corresponding frequency domain data is considered to be obtained from radar echo signals reflected by different targets. In addition, a same target may appear at different positions in different pulse repetition periods. Therefore, a target matching algorithm may be used for the frequency domain data in two different pulse repetition periods to determine the frequency domain data of the same target. An example of common target matching algorithms is a Kalman filtering method. After the frequency domain data of the same target in one behavior identification period is determined, the time-frequency domain analysis performed on the frequency domain data of the target in the scenario in which behavior of a single target needs to be identified may be performed on the frequency domain data of each target.
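The embodiment names Kalman filtering as an example of the target matching algorithm. As a much simpler stand-in that only illustrates the idea of matching detections of the same target across pulse repetition periods, a nearest-bin association could look as follows; the function name, inputs, and gap threshold are hypothetical, and this is not the actual matching algorithm of the embodiment.

```python
import numpy as np

def associate_detections(prev_bins, curr_bins, max_gap=2):
    """Toy nearest-bin association between detections of two consecutive
    pulse repetition periods (a stand-in for the Kalman-filter-based target
    matching mentioned above; not the actual algorithm).

    prev_bins, curr_bins: FFT bin indices whose amplitude exceeded the
    detection threshold in the previous / current period.
    Returns (previous_bin, current_bin) pairs judged to be the same target.
    """
    matches = []
    used = set()
    for p in prev_bins:
        candidates = [c for c in curr_bins if c not in used]   # unmatched detections
        if not candidates:
            break
        nearest = min(candidates, key=lambda c: abs(c - p))
        if abs(nearest - p) <= max_gap:      # same target if the bins are close enough
            matches.append((p, nearest))
            used.add(nearest)
    return matches
```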
Step 203: Process the time-frequency domain data to obtain signal attribute feature data and LPC feature data.
The signal attribute feature data is used to represent a feature of the radar echo signal attribute, and the LPC feature data is used to represent a feature of the radar echo signal.
During implementation, corresponding signal attribute feature data and LPC feature data can be determined for the time-frequency domain data of each target. Determining of the signal attribute feature data and the LPC feature data is described below.
For determining of the signal attribute feature data, the signal attribute feature data may include one or more of a maximum frequency value, a mean value of amplitude, a standard deviation of amplitude value, a mean absolute error of amplitude value, an amplitude value quartile, an amplitude value interquartile range, a spectral entropy, amplitude value skewness, and amplitude value kurtosis in the frequency domain data. Before the signal attribute feature data is calculated, the time-frequency domain data Qi may be converted into a one-dimensional row vector Ri, and then amplitude values of the converted time-frequency domain data are obtained, that is, a modulo operation is performed on each element in Ri, and the modulo operation formula may be as follows:
∥Ri∥=abs(Ri).
A method for calculating each piece of signal attribute feature data is described below, and a consolidated illustrative sketch follows item (9).
(1) For the maximum frequency value, that is, a maximum Doppler frequency value in the time-frequency spectrogram corresponding to the time-frequency domain data, the calculation formula may be as follows:
FTi,1=fi,max=max(fd,n),
Pieces of signal attribute feature data other than the maximum frequency value may be calculated based on the time-frequency domain data ∥Ri∥ that is obtained after the modulo operation is performed, and a calculation method is described below.
(2) For the mean value of amplitude, the calculation formula may be as follows:
(3) For the standard deviation of amplitude value, the calculation formula may be as follows:
(4) For the mean absolute error of amplitude value, the calculation formula may be as follows:
(5) The amplitude value quartile means that amplitude values are arranged from small to large and divided into four equal parts, and the amplitude values at the three points of division are the amplitude value quartiles. ∥Ri∥ is arranged from small to large based on sort(∥Ri∥), and the positions used to divide ∥Ri∥ into four equal parts are denoted as a, b, and c, which may be referred to as a 25% quartile, a 50% quartile, and a 75% quartile, respectively. A formula for calculating the 25% quartile may be as follows:
FTi,5=Quai=sort(∥Ri∥)a,
(6) The amplitude value interquartile range indicates a difference between the 75% quartile c and the 25% quartile a, and the calculation formula may be as follows:
FTi,6=IQRi=sort(∥Ri∥)c−Quai,
(7) The amplitude value skewness is a measure of a direction and degree of skewness of amplitude value distribution, and is a digital feature of a degree of asymmetry of the amplitude value distribution, and the calculation formula may be as follows:
(8) The amplitude value kurtosis is a digital feature reflecting the peakedness of the probability density distribution curve of the amplitude values at the position of its mean value, and the calculation formula may be as follows:
(9) The spectral entropy represents a relationship between a power spectrum corresponding to time-frequency domain data and an entropy rate, and the calculation formula may be expressed as follows:
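Purely as an illustrative consolidation of features (1) to (9), the sketch below computes the nine signal attribute features with NumPy and SciPy. The normalization conventions (population statistics, natural-logarithm spectral entropy, linear-interpolation quartiles, and the occupancy threshold used for the maximum Doppler frequency) are assumptions, since they are not fixed above.

```python
import numpy as np
from scipy.stats import skew, kurtosis

def signal_attribute_features(Qi, doppler_freqs):
    """Illustrative computation of the nine signal attribute features (1)-(9).

    Qi: W x C complex time-frequency matrix; doppler_freqs: Doppler frequency of
    each of the W rows. Normalization choices are assumptions, not the exact
    formulas of the embodiment.
    """
    doppler_freqs = np.asarray(doppler_freqs)
    Ri = np.abs(Qi).ravel()                      # ||Ri||: all amplitude values
    power = np.abs(Qi) ** 2

    # (1) maximum Doppler frequency among rows carrying significant energy
    row_energy = power.sum(axis=1)
    significant = row_energy > 0.1 * row_energy.max()    # threshold is an assumption
    ft1 = doppler_freqs[significant].max()

    ft2 = Ri.mean()                              # (2) mean value of amplitude
    ft3 = Ri.std()                               # (3) standard deviation of amplitude
    ft4 = np.mean(np.abs(Ri - ft2))              # (4) mean absolute error of amplitude
    q25, q75 = np.percentile(Ri, [25, 75])
    ft5 = q25                                    # (5) amplitude value quartile (25%)
    ft6 = q75 - q25                              # (6) interquartile range
    ft7 = skew(Ri)                               # (7) amplitude value skewness
    ft8 = kurtosis(Ri, fisher=False)             # (8) amplitude value kurtosis
    p = power.ravel() / power.sum()              # (9) spectral entropy of normalized power
    ft9 = -np.sum(p * np.log(p + 1e-12))
    return np.array([ft1, ft2, ft3, ft4, ft5, ft6, ft7, ft8, ft9])
```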
The LPC feature data corresponding to the time-frequency domain data can be obtained based on the LPC function. The LPC function may be as follows: x̂(n)=a1x(n−1)+a2x(n−2)+ . . . +aPx(n−P), where x(n) is the nth element of the data input into the LPC function, x̂(n) is a predicted value of x(n), P is a prediction order, and a1 to aP are the LPCs.
A squared prediction error E can be further obtained through calculation, and the calculation formula may be as follows:
E=Σe²(n)=Σ[x(n)−x̂(n)]²,
In each of the foregoing formulas, the value of P can be determined based on an actual requirement. To ensure that the final behavior identification is more accurate, a relatively large value of P can be used, for example, P=6. Then, for the formula used for calculating the squared prediction error E, a1 to a6 that minimize E can be calculated, that is, the six LPCs a1 to a6 are the LPC feature data corresponding to the time-frequency domain data, and the six LPCs can be represented as FTi,10 to FTi,15.
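One standard way to obtain a1 to aP that minimize the squared prediction error E is to solve the autocorrelation (Yule-Walker) normal equations. The embodiment does not specify a particular solver, so the following NumPy sketch is only one possible realization and assumes a real-valued input sequence (for example, the amplitude values).

```python
import numpy as np

def lpc_coefficients(x, P=6):
    """Illustrative LPC fit: find a1..aP minimizing the squared prediction error
    E = sum_n [x(n) - (a1*x(n-1) + ... + aP*x(n-P))]**2 by solving the
    autocorrelation (Yule-Walker) normal equations. This is one standard method;
    the embodiment does not fix a particular solver.
    """
    x = np.asarray(x, dtype=float)
    N = len(x)
    r = np.array([np.dot(x[:N - k], x[k:]) for k in range(P + 1)])      # autocorrelation r(0)..r(P)
    R = np.array([[r[abs(i - j)] for j in range(P)] for i in range(P)])  # Toeplitz system matrix
    a = np.linalg.solve(R, r[1:])                                        # a1..aP, the LPC feature data
    residual = np.array([x[n] - np.dot(a, x[n - P:n][::-1]) for n in range(P, N)])
    E = np.sum(residual ** 2)                                            # squared prediction error
    return a, E
```

With P=6 as in the example above, the returned coefficients a1 to a6 correspond to FTi,10 to FTi,15.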
Therefore, for a target, the feature data of the signal of the target may be expressed as FTi=[FTi,1, FTi,2, . . . FTi,15].
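Continuing the hypothetical sketches above (the helper functions and the placeholder data below are illustrative only, not part of the embodiment), the 15-dimensional feature vector FTi could be assembled as follows.

```python
import numpy as np

# Hypothetical continuation of the sketches above: assemble FTi from the nine
# signal attribute features and the six LPCs of one behavior identification period.
Qi = np.random.randn(64, 32) + 1j * np.random.randn(64, 32)   # placeholder W x C data
doppler_freqs = np.linspace(-500.0, 500.0, 64)                # placeholder Doppler axis

attribute_features = signal_attribute_features(Qi, doppler_freqs)   # FTi,1 .. FTi,9
lpc_features, _ = lpc_coefficients(np.abs(Qi).ravel(), P=6)         # FTi,10 .. FTi,15
FTi = np.concatenate([attribute_features, lpc_features])            # 15-dimensional feature vector
```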
In a possible implementation, to reduce the amount of data during feature extraction, correspondingly, dimension reduction may be performed on the time-frequency domain data before the time-frequency domain data is processed.
During implementation, based on the processing in step 202, it can be learned that the time-frequency domain data Qi of the target is a two-dimensional matrix of W×C. Then, dimension reduction may be performed on each Qi to reduce the amount of data. The dimension reduction processing may be performed based on a PCA algorithm, a singular value decomposition algorithm, a manifold learning algorithm, or the like. The dimension reduction based on the PCA algorithm is described below.
For the time-frequency domain data Qi of the target, q(l) is a one-dimensional vector of the lth row in Qi, where l=1, 2, . . . , W, the size of q(l) is 1×C, and the covariance matrix may be calculated based on the following calculation formula:
It should be noted that the processing performed on the time-frequency domain data of the target in step 203 may be performed on the dimension-reduced time-frequency domain data Ti of the target.
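A minimal PCA sketch consistent with this description is shown below, assuming NumPy, treating each row q(l) of Qi as one observation, and working on amplitude values. The retained dimension D and these conventions are assumptions rather than requirements of the embodiment.

```python
import numpy as np

def pca_reduce(Qi, D):
    """Illustrative PCA dimension reduction of the W x C matrix Qi.

    Each row q(l) (size 1 x C) is treated as one observation (an assumed
    convention), and the data are projected onto the D principal directions
    with the largest variance, giving the dimension-reduced data Ti (W x D).
    """
    X = np.abs(Qi)                               # work on amplitude values (assumption)
    X_centered = X - X.mean(axis=0)              # remove the mean of each column
    cov = X_centered.T @ X_centered / (X.shape[0] - 1)     # C x C covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)       # eigen-decomposition (ascending order)
    top = eigvecs[:, np.argsort(eigvals)[::-1][:D]]        # D leading principal components
    Ti = X_centered @ top                        # dimension-reduced data, W x D
    return Ti
```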
Step 204: Input the signal attribute feature data and the LPC feature data into a behavior identification model, and output behavior information of the target.
The behavior identification model may be an SVM classification model, or may be a neural network model. For ease of description, the signal attribute feature data and the LPC feature data of the target are collectively referred to as the feature data of the target.
During implementation, the feature data that is of the target and that is obtained through the foregoing processing may be input into a pre-trained behavior identification model to obtain the behavior information of the target, for example, walking, running, moving away, approaching, crossing a road, or diagonally crossing a road. The following description is based on an example in which the behavior identification model is an SVM classification model.
During classification of behavior information, the SVM classification model can be used to classify the behavior information into only two types. To finally obtain a combination of a plurality of types of behavior information of a target, a plurality of SVM classification models can be combined for use. That is, a first SVM can classify the behavior information of the target into two types first, and a second SVM can classify the two types of behavior information to obtain subtypes corresponding to each type, and so on in order to obtain a combination of a plurality of types of behavior information.
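As an illustration of combining binary SVM classifiers in this way, the following sketch uses the scikit-learn SVC class arranged as a two-level tree. The coarse grouping of behaviors, the kernel, and all names are assumptions; each leaf classifier is assumed to receive training samples of at least two behavior types.

```python
import numpy as np
from sklearn.svm import SVC

def train_svm_tree(features, labels):
    """Illustrative two-level binary-tree combination of SVM classifiers.

    features: (num_samples, 15) array of feature vectors FTi.
    labels: array of behavior names; the coarse grouping below is hypothetical.
    """
    features = np.asarray(features)
    labels = np.asarray(labels)
    coarse = np.isin(labels, ["running", "jumping"])          # first split (assumed grouping)
    root = SVC(kernel="rbf").fit(features, coarse)             # first SVM: two coarse types
    leaf_a = SVC(kernel="rbf").fit(features[coarse], labels[coarse])    # subtypes of type A
    leaf_b = SVC(kernel="rbf").fit(features[~coarse], labels[~coarse])  # subtypes of type B
    return root, leaf_a, leaf_b

def predict_behavior(tree, ft):
    """Route one feature vector through the root SVM, then through the matching leaf SVM."""
    root, leaf_a, leaf_b = tree
    ft = np.asarray(ft).reshape(1, -1)
    leaf = leaf_a if root.predict(ft)[0] else leaf_b
    return leaf.predict(ft)[0]
```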
It should be noted herein that only a few examples of the behavior information obtained based on the behavior identification model are listed above, and because the behavior identification model can be trained using different samples, different behavior information may be identified based on the behavior identification model. In addition, it should be noted that the processing procedure in step 201 to step 203 may be used as a method for obtaining feature data of a target in a process of identifying behavior of the target in actual application, or may be used as a method for obtaining feature data of a sample in training samples. When the processing procedure in step 201 to step 203 is used as a method for obtaining the feature data of a sample in training samples, after the feature data of a target in a behavior identification period is obtained, X pieces of feature data in X behavior identification periods can be continuously obtained and used as a sample feature dataset to train the behavior identification model. To improve accuracy of the trained behavior identification model, X can be a relatively large value, for example, tens of thousands, hundreds of thousands, or even a larger value.
Based on the same technical concept, an embodiment of this application further provides a behavior identification apparatus. As shown in
The receiving module 610 is configured to receive a radar echo signal reflected by a target. Further, the receiving module 610 can implement the function of receiving the radar echo signal in step 201, and other implicit steps.
The processing module 620 is configured to process the radar echo signal to obtain time-frequency domain data, and process the time-frequency domain data to obtain signal attribute feature data and LPC feature data, where the signal attribute feature data is used to represent a feature of the radar echo signal attribute, and the LPC feature data is used to represent a feature of the radar echo signal. Further, the processing module 620 can implement the function of processing the radar echo signal in step 202, the function of processing the time-frequency domain data in step 203, and other implicit steps.
The identification module 630 is configured to input the signal attribute feature data and the LPC feature data into a behavior identification model, and output behavior information of the target. Further, the identification module 630 can implement the function of determining behavior information of the target in step 204, and other implicit steps.
In a possible implementation, the processing module 620 is configured to process the time-frequency domain data to obtain the signal attribute feature data, and input the time-frequency domain data into an LPC function to obtain the LPC feature data.
In a possible implementation, the processing module 620 is further configured to perform dimension reduction on the time-frequency domain data.
In a possible implementation, the processing module 620 is configured to perform dimension reduction on the time-frequency domain data of each target based on a PCA algorithm.
In a possible implementation, the signal attribute feature data includes one or more of a maximum frequency value, a mean value of amplitude, a standard deviation of amplitude value, a mean absolute error of amplitude value, an amplitude value quartile, an amplitude value interquartile range, a spectral entropy, amplitude value skewness, and amplitude value kurtosis.
In a possible implementation, the behavior identification model is an SVM classifier model.
It may be noted that when the apparatus for identifying behavior of a target provided in the foregoing embodiments identifies behavior of the target, division of the foregoing function modules is taken only as an example for illustration. In actual application, the foregoing functions can be allocated to different function modules and implemented according to a requirement, that is, an inner structure of the radar system is divided into different function modules to implement all or part of the functions described above. In addition, the apparatus for identifying behavior of a target provided in the foregoing embodiment and the embodiment of the method for identifying behavior of a target belong to the same concept. For a detailed implementation process of the apparatus, reference may be made to the method embodiment, and details are not described herein again.
All or some of the foregoing embodiments may be implemented by software, hardware, firmware, or any combination thereof. When software is used to implement the embodiments, the embodiments may be implemented completely or partially in a form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on the computer, the procedure or functions according to the foregoing embodiments of this application are all or partially generated. The computer instructions may be stored in a computer-readable storage medium. The computer-readable storage medium may be any usable medium accessible by a device, or a data storage device integrating one or more usable media. The usable medium may be a magnetic medium (for example, a floppy disk, a hard disk, a magnetic tape, or the like), an optical medium (for example, a digital versatile disc (DVD), or the like), or a semiconductor medium (for example, a solid-state drive, or the like).
A person of ordinary skill in the art may understand that all or some of the steps of the embodiments may be implemented by hardware or a program instructing related hardware. The program may be stored in a computer-readable storage medium. The storage medium may include a read-only memory (ROM), a magnetic disk, or an optical disc.
The foregoing descriptions are merely specific embodiments of this application, but are not intended to limit this application. Any modification, equivalent replacement, or improvement made without departing from the spirit and principle of this application should fall within the protection scope of this application.
This application is a continuation of International Patent Application No. PCT/CN2020/085139 filed on Apr. 16, 2020, which claims priority to Chinese Patent Application No. 201910817089.3 filed on Aug. 30, 2019. The disclosures of the aforementioned applications are hereby incorporated by reference in their entireties.