This application relates to the field of communication technologies, and specifically, to a target positioning sensing method and apparatus, a communication device, and a storage medium.
Main positioning technologies currently used in communication systems are the new radio (NR) positioning technology and the long term evolution (LTE) positioning technology. These positioning technologies can locate only devices that themselves use the technologies. For example, a terminal can determine only its own location through the NR positioning technology. Because only a device using the positioning technology can be located, the communication system has poor positioning capabilities.
Embodiments of this application provide a target positioning sensing method and apparatus, a communication device, and a storage medium.
According to a first aspect, a target positioning sensing method is provided, including:
According to a second aspect, a target positioning sensing method is provided, including:
According to a third aspect, a target positioning sensing apparatus is provided, including:
According to a fourth aspect, a target positioning sensing apparatus is provided, including:
According to a fifth aspect, a communication device is provided, including a processor and a memory, where the memory stores a program or instructions executable on the processor, and the program or the instructions, when executed by the processor, implement the steps of the target positioning sensing method on the first device side according to embodiments of this application.
According to a sixth aspect, a communication device is provided, including a processor and a communication interface, where the communication interface is configured to perform sensing measurement on a sensing target to obtain a first measurement quantity result of a first measurement quantity of a dynamic reflection path of a first signal, where the first measurement quantity includes: at least one of a reflection path Doppler frequency or a reflection path length change speed, and the first signal is a signal sent by a second device to the first device; and send the first measurement quantity result, where the first measurement quantity result is used for determining a positioning sensing result of the sensing target.
According to a seventh aspect, a communication device is provided, including a processor and a memory, where the memory stores a program or instructions executable on the processor, and the program or the instructions, when executed by the processor, implement the steps of the target positioning sensing method on the third device side according to embodiments of this application.
According to an eighth aspect, a communication device is provided, including a processor and a communication interface, where the communication interface is configured to receive first measurement quantity results sent by at least two first devices, where the first measurement quantity results are results of a first measurement quantity of a dynamic reflection path of a first signal that are obtained by the first devices by performing sensing measurement on a sensing target, the first measurement quantity includes: at least one of a reflection path Doppler frequency or a reflection path length change speed, and the first signal is a signal sent by a second device to the first device; and the processor or the communication interface is configured to determine a positioning sensing result of the sensing target based on the first measurement quantity results sent by the at least two first devices.
According to a ninth aspect, a communication system is provided, including: a first device and a third device. The first device may be configured to perform the steps of the target positioning sensing method according to the first aspect. The third device may be configured to perform the steps of the target positioning sensing method according to the second aspect.
According to a tenth aspect, a readable storage medium is provided, where the readable storage medium stores a program or instructions, and the program or the instructions, when executed by a processor, implement the steps of the target positioning sensing method on the first device side according to embodiments of this application, or implement the steps of the target positioning sensing method on the third device side according to embodiments of this application.
According to an eleventh aspect, a chip is provided, including a processor and a communication interface, where the communication interface and the processor are coupled, and the processor is configured to run a program or instructions to implement the target positioning sensing method on the first device side, or implement the target positioning sensing method on the third device side.
According to a twelfth aspect, a computer program product is provided, stored in a storage medium, where the computer program product is executed by at least one processor to implement the steps of the target positioning sensing method on the first device side, or the computer program product is executed by at least one processor to implement the steps of the target positioning sensing method on the third device side.
The technical solutions in embodiments of this application are clearly described below with reference to the accompanying drawings in embodiments of this application. Apparently, the described embodiments are some rather than all of embodiments of this application. All other embodiments obtained by a person of ordinary skill in the art based on embodiments of this application fall within the protection scope of this application.
In the specification and claims of this application, the terms “first” and “second” are used to distinguish between similar objects, but are not used to describe a specific sequence or order. It is to be understood that the terms so used are interchangeable under appropriate circumstances such that embodiments of this application are capable of being practiced in sequences other than those illustrated or described herein, and that the objects distinguished by “first” and “second” are generally of one class, and the number of objects is not limited, for example, the first object may be one or more. In addition, “and/or” in the description and claims represents at least one of the connected objects, and the character “/” generally indicates an “or” relationship between associated objects.
It should be noted that the technologies described in embodiments of this application are not limited to a long term evolution (LTE)/LTE-advanced (LTE-A) system, and may be further applied to other wireless communication systems such as code division multiple access (CDMA), time division multiple access (TDMA), frequency division multiple access (FDMA), orthogonal frequency division multiple access (OFDMA), single-carrier frequency division multiple access (SC-FDMA), and other systems. The terms “system” and “network” may be used interchangeably in embodiments of this application. The technologies described can be applied to the systems and radio technologies described above, and can also be applied to other systems and radio technologies. The following description describes a new radio (NR) system for illustrative purposes, and NR terminology is used in most of the description below. These technologies are also applicable to applications other than NR system applications, for example, a 6th generation (6G) communication system.
The first device 11 may be configured to perform sensing measurement, and in particular may perform sensing measurement on a signal (for example, a dedicated sensing signal, an integrated sensing and communication signal, or an LTE or NR reference signal) sent by the second device 12 and send the obtained measurement quantity result to the third device.
The second device 12 may be configured to send a signal (for example, a dedicated sensing signal, an integrated sensing and communication signal, or an LTE or NR reference signal) so that the first device 11 performs sensing measurement; the second device 12 itself may not perform measurement.
The third device 13 may be configured to determine a positioning sensing result of a sensing target according to measurement quantity results sent by the at least two first devices 11.
In this embodiment, the first device 11 may be a terminal, a network side device, or a dedicated sensing device, the second device 12 may be a terminal, a network side device, or a dedicated sensing device, and the third device 13 may be a terminal, a network side device, a dedicated sensing device, a third party service, or the like.
In this embodiment of this application, the terminal may be a terminal side device such as a mobile phone, a tablet personal computer, a laptop computer (also referred to as a notebook computer), a personal digital assistant (PDA), a palmtop computer, a netbook, an ultra-mobile personal computer (UMPC), a mobile Internet device (MID), an augmented reality (AR)/virtual reality (VR) device, a robot, a wearable device, vehicle user equipment (VUE), pedestrian user equipment (PUE), smart home (home equipment with wireless communication functions, such as a refrigerator, a TV, a washing machine, or furniture), a game console, a personal computer (PC), a teller machine, or a self-service machine. The wearable device includes: a smart watch, a smart band, smart headphones, smart glasses, smart jewelry (a smart bracelet, a smart ring, a smart necklace, a smart anklet, and the like), a smart wristband, smart clothing, and the like. It should be noted that in embodiments of this application, a specific type of the terminal is not limited.
The network side device may include an access network device or a core network device, where the access network device may also be referred to as a radio access network device, a radio access network (RAN), a radio access network function, or a radio access network unit. The access network device may include a base station, a small base station, a wireless local area network (WLAN) access point, a Wireless Fidelity (WiFi) node, or the like. The base station may be referred to as a Node B, an evolved Node B (eNB), an access point, a base transceiver station (BTS), a radio base station, a radio transceiver, a basic service set (BSS), an extended service set (ESS), a Home Node B, a Home evolved Node B, a transmitting receiving point (TRP), or another suitable term in the field; provided that the same technical effect is achieved, the base station is not limited to a specific technical term. It should be noted that, only a base station in an NR system is used as an example in embodiments of this application, and the specific type of the base station is not limited. The core network device may include, but is not limited to, at least one of a core network node, a core network function, a mobility management entity (MME), an access and mobility management function (AMF), a session management function (SMF), a user plane function (UPF), a policy control function (PCF), a policy and charging rules function (PCRF), an edge application server discovery function (EASDF), unified data management (UDM), unified data repository (UDR), a home subscriber server (HSS), centralized network configuration (CNC), a network repository function (NRF), a network exposure function (NEF), local NEF (Local NEF, or L-NEF), a binding support function (BSF), an application function (AF), and the like.
It should be noted that this embodiment of this application is described only by taking a core network device in an NR system as an example, and a specific type of the core network device is not limited.
A target positioning sensing method and apparatus, a communication device, and a storage medium according to embodiments of this application are described in detail below by using some embodiments and application scenarios thereof with reference to the accompanying drawings.
Refer to
Step 201: A first device performs sensing measurement on a sensing target, to obtain a first measurement quantity result of a first measurement quantity of a dynamic reflection path of a first signal, where the first measurement quantity includes: at least one of a reflection path Doppler frequency or a reflection path length change speed, and the first signal is a signal sent by a second device to the first device.
The first device may be a terminal, a network side device, or a dedicated sensing device.
In an implementation, the first device includes: a terminal, a network side device, or a dedicated sensing device for performing sensing measurement.
The second device may be a terminal, a network side device, or a dedicated sensing device.
In an implementation, the second device includes a terminal, a network side device, or a dedicated sensing device for sending the first signal.
The sensing target may be a target object such as a terminal, a vehicle, a person, or an animal.
The sensing measurement may be to measure the first measurement quantity of the dynamic reflection path of the first signal to obtain the first measurement quantity result.
The dynamic reflection path may be a dynamic reflection path introduced by the sensing target into a wireless channel from the second device to the first device. Specifically, the dynamic reflection path may be a multipath component of the first signal that is reflected by the sensing target and then received by the first device.
The first measurement quantity result may include at least one of a reflection path Doppler frequency or a reflection path length change speed. The reflection path Doppler frequency may represent a length change speed of the dynamic reflection path. For example, because the sensing target is in motion, a Doppler frequency is introduced into the wireless channel from the second device to the first device, and the length of the reflection path via the sensing target changes. The length change speed corresponds to the Doppler frequency of the reflection path in its propagation directions (the incident and reflected directions relative to the sensing target, for example, a human body).
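The relationship between the two first measurement quantities described above can be illustrated by a short sketch. This is not part of the claimed method; it only shows the narrowband conversion between a path Doppler frequency and a path length change speed, assuming the carrier frequency is known and using the convention that a positive Doppler frequency corresponds to a shortening path.

```python
# Illustrative sketch only: converting a reflection path Doppler frequency
# to a reflection path length change speed under a narrowband assumption.
C = 299_792_458.0  # speed of light, m/s

def path_length_change_speed(doppler_hz: float, carrier_hz: float) -> float:
    """Reflection path length change speed (m/s) from its Doppler frequency.

    Uses f_d = -(1/lambda) * dL/dt, hence dL/dt = -f_d * lambda,
    where lambda = c / f_c is the carrier wavelength.
    """
    wavelength = C / carrier_hz
    return -doppler_hz * wavelength

# Example: +100 Hz Doppler at a 3.5 GHz carrier means the dynamic
# reflection path is shortening at roughly 8.57 m/s.
speed = path_length_change_speed(100.0, 3.5e9)
```

The sign convention and the 3.5 GHz carrier value are assumptions chosen for illustration.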
In an implementation, the first signal includes one of:
The dedicated sensing signal may be a signal dedicated for sensing measurement of a target, the integrated sensing and communication signal may be understood as an integrated sensing and communication (ISAC) signal, and the LTE or NR reference signal may be a reference signal such as a positioning reference signal (PRS) or a sounding reference signal (SRS).
In addition, the first signal may be a signal sent by a terminal to another terminal, may be a signal sent by a network side device to a terminal, or may be a signal sent by a terminal to a network side device.
Step 202: The first device sends the first measurement quantity result, where the first measurement quantity result is used for determining a positioning sensing result of the sensing target.
That the first device sends the first measurement quantity result may be that the first device directly reports the first measurement quantity result to the third device, that the first device reports the first measurement quantity result to the third device through the second device, or that one first device reports the first measurement quantity result to the third device through another first device.
That the first measurement quantity result is used for determining a positioning sensing result of the sensing target may be that after the third device receives the first measurement quantity result, the positioning sensing result of the sensing target may be determined based on the first measurement quantity result.
In an implementation, the third device may determine the positioning sensing result of the sensing target based on first measurement quantity results reported by at least two first devices. Specifically, according to the first measurement quantity result reported by each first device, a dynamic reflection path of the sensing target to the first device may be determined, so that intersections between dynamic reflection paths corresponding to at least two first devices can be determined, and then a location of the sensing target can be determined based on such intersections. For example, an intersection between dynamic reflection paths corresponding to two first devices is a current location of the sensing target.
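The intersection step described above can be sketched in simplified form. This is a hypothetical illustration, not the claimed algorithm: each dynamic reflection path is reduced to a 2-D ray leaving a first device at a measured angle, and the location of the sensing target is taken as the intersection of two such rays. Device positions and angles are made-up values.

```python
import math

# Hedged sketch: locate a target at the intersection of two rays, each
# standing in for a dynamic reflection path determined for one first device.
def intersect_rays(p1, theta1, p2, theta2):
    """Intersection of rays p_i + t_i * (cos(theta_i), sin(theta_i)) in 2-D."""
    d1 = (math.cos(theta1), math.sin(theta1))
    d2 = (math.cos(theta2), math.sin(theta2))
    bx, by = p2[0] - p1[0], p2[1] - p1[1]
    # Solve p1 + t1*d1 = p2 + t2*d2 via a 2x2 determinant.
    det = d1[0] * (-d2[1]) - (-d2[0]) * d1[1]
    if abs(det) < 1e-12:
        raise ValueError("rays are parallel; no unique intersection")
    t1 = (bx * (-d2[1]) - (-d2[0]) * by) / det
    return (p1[0] + t1 * d1[0], p1[1] + t1 * d1[1])

# Rays from (0, 0) at 45 degrees and from (10, 0) at 135 degrees meet at (5, 5).
loc = intersect_rays((0.0, 0.0), math.pi / 4, (10.0, 0.0), 3 * math.pi / 4)
```

With more than two first devices, a least-squares intersection of all rays would typically be used instead of a pairwise solve.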
In this embodiment of this application, the first measurement quantity result sent by the first device can be used to implement sensing positioning of the sensing target, thereby improving positioning capabilities of the communication system. For example, a location of a target such as another terminal, a pedestrian, or a vehicle can be determined by using the first device.
In an optional implementation, the positioning sensing result includes at least one of:
The speed of the sensing target may be determined based on the reflection path Doppler frequency or the reflection path length change speed in the at least two first measurement quantity results and locations of the first device and the second device.
The speed direction of the sensing target may be determined based on the reflection path Doppler frequency or the reflection path length change speed in the at least two first measurement quantity results and locations of the first device and the second device.
The trajectory of the sensing target may be determined based on locations of the sensing target at a plurality of measurement moments, where the locations of the sensing target are determined based on the reflection path Doppler frequency or the reflection path length change speed in the at least two first measurement quantity results.
The future predicted location of the sensing target may be a location predicted at next one or more moments according to a current location of the sensing target. For example, assuming that two adjacent times of measurement are sufficiently short in time (for example, 5 to 10 ms) relative to pedestrian motion, the pedestrian can be approximated as moving in a straight line at a uniform speed in this period, so that a location of the pedestrian at the next moment can be predicted.
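The constant-velocity prediction described above can be sketched as follows. This is a minimal illustration of the stated approximation (straight-line, uniform-speed motion between two closely spaced measurements); the coordinate values are made up.

```python
# Sketch of the prediction above: when two adjacent measurements are close
# in time (e.g. 5-10 ms) relative to pedestrian motion, the target is
# approximated as moving in a straight line at uniform speed.
def predict_next(prev_loc, curr_loc, steps=1):
    """Extrapolate the location 'steps' measurement intervals ahead."""
    vx = curr_loc[0] - prev_loc[0]
    vy = curr_loc[1] - prev_loc[1]
    return (curr_loc[0] + steps * vx, curr_loc[1] + steps * vy)

# A pedestrian observed at (1.0, 2.0) and then (1.1, 2.2) is predicted
# to be near (1.2, 2.4) at the next measurement moment.
nxt = predict_next((1.0, 2.0), (1.1, 2.2))
```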
In an optional implementation, that the first device sends the first measurement quantity result includes:
The first timestamp is a timestamp at which the first measurement quantity result is recorded by the first device, so that the measurement time corresponding to each first measurement quantity result can be accurately represented by the first measurement quantity result and the first timestamp, and then the third device can accurately determine the trajectory of the sensing target.
The first serial number may be a measurement serial number of the first measurement quantity results, so that the measurement time corresponding to each first measurement quantity result can be accurately represented by the first measurement quantity result and the first serial number, and then the third device can accurately determine the trajectory of the sensing target.
In an optional implementation, that the first device sends the first measurement quantity result includes:
The plurality of first measurement quantity results may be measurement quantity results measured at a plurality of measurement moments, so that the third device can accurately determine the trajectory of the sensing target based on the plurality of first measurement quantity results.
In an optional implementation, the method further includes:
The angle of arrival APS may be denoted as angle of arrival (AOA) APS, and the angle of departure APS may be denoted as angle of departure (AOD) APS.
The second measurement quantity result of the second measurement quantity can be obtained according to an angle measurement method in an NR positioning technology and NR beam management ideas, or can be implemented by using an algorithm of the first device. For example, the angle power spectrum can be obtained through fast Fourier transform (FFT) (including zero-padded FFT), a commonly used spatial filter (for example, a Bartlett beamformer), minimum variance distortionless response (MVDR), multiple signal classification (MUSIC), or refinement algorithms thereof. In addition, in this embodiment of this application, dynamic reflection path recognition can be implemented by Doppler spectrum estimation combined with pattern recognition or machine learning. This is not specifically limited.
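As one concrete example of the spectrum estimators listed above, a Bartlett-style angle power spectrum can be sketched as follows. This is an illustration under stated assumptions (a uniform linear array with half-wavelength spacing and a single noiseless snapshot), not the measurement procedure of the embodiment itself.

```python
import cmath
import math

# Hedged sketch: Bartlett angle power spectrum (APS) for one snapshot of
# per-antenna complex samples on a half-wavelength-spaced uniform linear array.
def bartlett_aps(snapshot, angles_deg):
    n = len(snapshot)
    aps = []
    for ang in angles_deg:
        phi = math.pi * math.sin(math.radians(ang))
        # Steering weight for antenna k is exp(-j * pi * k * sin(theta)).
        y = sum(snapshot[k] * cmath.exp(-1j * phi * k) for k in range(n))
        aps.append(abs(y) ** 2 / n)
    return aps

# Simulated plane wave arriving from 20 degrees on an 8-element array.
n_ant = 8
true_phi = math.pi * math.sin(math.radians(20.0))
x = [cmath.exp(1j * true_phi * k) for k in range(n_ant)]
angles = list(range(-90, 91))
spectrum = bartlett_aps(x, angles)
peak_angle = angles[spectrum.index(max(spectrum))]
```

In practice the APS would be computed from a sample covariance over many snapshots, and MVDR or MUSIC would give sharper peaks at higher cost.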
The first device may send the second measurement quantity result to the third device directly or indirectly. After receiving the second measurement quantity result, the third device determines at least one of the initial location or the trajectory of the sensing target based on the second measurement quantity result.
In this implementation, the positioning capabilities of the communication system can be further improved through the second measurement quantity result.
Optionally, the performing, by the first device, APS measurement on a wireless channel in which the sensing target is located, to obtain a second measurement quantity result of a second measurement quantity includes at least one of:
The fixed downlink beam may be a downlink beam of the network side device corresponding to a maximum downlink positioning reference signal (DL-PRS) reference signal received power (RSRP) measured by the terminal, and the fixed uplink beam may be an uplink beam of the terminal corresponding to a maximum uplink sounding reference signal (UL-SRS) RSRP measured by the network side device.
Optionally, the method further includes:
The suppressing interference energy other than a dynamic reflection path spectrum peak in the second measurement quantity result may be to identify and track the spectrum peak corresponding to the dynamic reflection path based on the plurality of measurement quantity results by detecting spectrum peak power fluctuations, using another pattern recognition method, or using a machine learning method, and then suppress the interference energy other than the dynamic reflection path spectrum peak in a subsequent measurement quantity result to obtain the second measurement quantity result with suppressed interference energy.
In this implementation, because the suppressed second measurement quantity result is reported, the accuracy of the sensing positioning can be improved.
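The suppression step described in this implementation can be sketched as follows. The peak tracking itself (fluctuation detection, pattern recognition, or machine learning) is omitted; the example assumes the dynamic reflection path spectrum peak has already been identified, and the window width is an arbitrary illustrative choice.

```python
# Hedged sketch: suppress interference energy outside the spectrum peak of
# the dynamic reflection path before reporting the measurement result.
# The peak index is assumed to come from an earlier tracking stage.
def suppress_outside_peak(aps, peak_index, half_width=2):
    """Keep only bins within half_width of the tracked peak; zero the rest."""
    return [v if abs(i - peak_index) <= half_width else 0.0
            for i, v in enumerate(aps)]

aps = [0.1, 0.2, 0.1, 3.0, 9.0, 3.2, 0.4, 0.1]   # peak tracked at index 4
cleaned = suppress_outside_peak(aps, peak_index=4, half_width=1)
# cleaned retains indices 3-5 around the peak and zeroes the remaining bins.
```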
Optionally, the sending, by the first device, the second measurement quantity result includes:
In this implementation, the measurement time corresponding to each second measurement quantity result can be accurately represented by the second measurement quantity result and the second timestamp, so that the third device can accurately determine at least one of the initial location and the trajectory of the sensing target. Moreover, the measurement time corresponding to each second measurement quantity result can be accurately represented by the second measurement quantity result and the second serial number, so that the third device can accurately determine at least one of the initial location and the trajectory of the sensing target.
Optionally, the second measurement quantity includes one of:
The channel angle of arrival APS refers to an angle of arrival APS of the entire channel, and the channel angle of departure APS refers to an angle of departure APS of the entire channel. The target angle range may be identified from historical APS measurement results and dynamic reflection path spectrum peaks. In this way, only the measurement quantity result of the angle of arrival APS or the angle of departure APS within the target angle range needs to be reported, reducing reporting overheads.
Optionally, the method further includes:
The third device may include one of:
The angle information may be the angle of arrival or the angle of departure from the sensing target to each first device, determined based on location coordinates of the sensing target, where the location coordinates are determined by the third device based on a Doppler frequency measurement result and a trajectory initial location estimation result of the sensing target.
In this implementation, the APS measurement of the angle of arrival or the angle of departure is performed on the sensing target, to reduce a measurement calculation amount at the terminal, and reduce reporting overheads.
In an optional implementation, the method further includes:
The parameter configuration information may be sent by the third device to the first device, and in some implementations, the parameter configuration information may alternatively be protocol defined or pre-configured.
Optionally, the parameter configuration information includes at least one of:
The waveform may include at least one of:
The intra-burst signal time interval refers to a time interval of sensing signals/integrated sensing and communication signals/reference signals within one burst, and the inter-burst time interval refers to the time interval between adjacent bursts when a plurality of bursts need to be sent.
The signal format may be synchronization signal block (SSB), PRS, demodulation reference signal (DMRS), or scheduling request (SR).
The signal direction may be a direction of the first signal or beam information.
The time resource may be index information of a time resource of the first signal, for example, an index of a slot where the first signal is located or a symbol index of the slot.
The frequency resource may be a center frequency, a bandwidth, a resource block (RB), a subcarrier, Point A, a starting bandwidth location of the first signal, or the like, where Point A is a reference point of the frequency resource.
The QCL relationship may be a quasi co-location (QCL) relationship between a resource of the first signal and an SSB. For example, the first signal includes a plurality of resources, each of which is quasi co-located with one SSB. In this embodiment of this application, the QCL relationship may include a protocol-defined Type A, Type B, Type C, or Type D QCL relationship.
In an optional implementation, the method further includes:
The reporting, by the first device, device information of the first device to a third device may include at least one of:
The device information may include at least one of:
The status information may include at least one of:
The sensing capability information may include at least one of:
In this implementation, through the device information, the third device can flexibly select the device participating in the collaborative sensing according to actual requirements and situations of the sensing target.
In embodiments of this application, the first device performs sensing measurement on a sensing target to obtain a first measurement quantity result of a first measurement quantity of a dynamic reflection path of a first signal, where the first measurement quantity includes: at least one of a reflection path Doppler frequency or a reflection path length change speed, and the first signal is a signal sent by a second device to the first device; and the first device sends the first measurement quantity result, where the first measurement quantity result is used for determining a positioning sensing result of the sensing target. In this way, by using the first measurement quantity result sent by the first device, sensing positioning of the sensing target can be implemented, thereby improving positioning capabilities of the communication system.
Refer to
Step 301: A third device receives first measurement quantity results sent by at least two first devices, where the first measurement quantity results are results of a first measurement quantity of a dynamic reflection path of a first signal that are obtained by the first devices by performing sensing measurement on a sensing target, the first measurement quantity includes: at least one of a reflection path Doppler frequency or a reflection path length change speed, and the first signal is a signal sent by a second device to the first device.
Step 302: The third device determines a positioning sensing result of the sensing target based on the first measurement quantity results sent by the at least two first devices.
According to the first measurement quantity result reported by each first device, the third device can determine a dynamic reflection path of the sensing target to the first device, so that intersections between dynamic reflection paths corresponding to at least two first devices can be determined, and then a location of the sensing target can be determined based on such intersections. For example, an intersection between dynamic reflection paths corresponding to two first devices is a current location of the sensing target.
Optionally, the first device includes:
The second device includes:
Optionally, the positioning sensing result includes at least one of:
The speed of the sensing target may be determined based on the reflection path Doppler frequency or the reflection path length change speed in the at least two first measurement quantity results and locations of the first device and the second device.
The speed direction of the sensing target may be determined based on the reflection path Doppler frequency or the reflection path length change speed in the at least two first measurement quantity results and locations of the first device and the second device.
The trajectory of the sensing target may be determined based on locations of the sensing target at a plurality of measurement moments, where the locations of the sensing target are determined based on the reflection path Doppler frequency or the reflection path length change speed in the at least two first measurement quantity results.
The future predicted location of the sensing target may be a location predicted at next one or more moments according to a current location of the sensing target. For example, assuming that two adjacent times of measurement are sufficiently short in time (for example, 5 to 10 ms) relative to pedestrian motion, the pedestrian can be approximated as moving in a straight line at a uniform speed in this period, so that a location of the pedestrian at the next moment can be predicted.
Optionally, the first signal includes one of:
Optionally, that a third device receives first measurement quantity results sent by at least two first devices includes:
That the third device determines a positioning sensing result of the sensing target based on the first measurement quantity results sent by the at least two first devices includes:
Optionally, that a third device receives first measurement quantity results sent by at least two first devices includes:
Optionally, the method further includes:
The initial location may be a pre-estimated location, or an initial location assumed from prior information.
In an implementation, the initial location of the sensing target includes:
The initial location of the sensing target determined based on the terminal positioning technology may be an initial location of the sensing target determined based on a positioning system (for example, a global positioning system (GPS)), or an initial location of the sensing target determined based on an LTE or NR positioning technology, or an initial location of the sensing target determined through Bluetooth or ultra wide band (UWB).
Optionally, the initial location of the sensing target determined based on the device-free technology includes:
The initial location of the sensing target determined based on echolocation may be obtained by the network side device or the terminal locating the target based on an echo of a self-transmitted sensing signal. For example, at a trajectory sensing service start moment, the network side device emits a sensing signal toward the sensing target, receives an echo, and performs distance measurement and angle measurement on the pedestrian, to obtain an initial location of a pedestrian motion trajectory.
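The distance-plus-angle step above reduces to a simple polar-to-Cartesian conversion, sketched below. The device position, distance, and angle values are made up for illustration; in practice they would come from the echo measurement.

```python
import math

# Hedged sketch: convert an echo-measured distance and angle into an
# initial 2-D location relative to the sensing device's position.
def initial_location(distance_m, angle_rad, device_xy=(0.0, 0.0)):
    return (device_xy[0] + distance_m * math.cos(angle_rad),
            device_xy[1] + distance_m * math.sin(angle_rad))

# A pedestrian 10 m away at 30 degrees from a device at the origin.
loc0 = initial_location(10.0, math.pi / 6)
```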
The initial location of the sensing target determined based on the angle information of the sensing target may be an initial location of the sensing target estimated based on a reflection path angle of arrival or departure of the first device to the sensing target.
In an implementation, a first device with a high sensing capability is capable of sensing in a spontaneous self-collection manner, with a large quantity of antennas, a large bandwidth, and a high sensing resolution. In this way, initial positioning of the sensing target can be achieved by using angle information measured by the single first device.
In another implementation, the initial location of the sensing target determined based on the angle information of the sensing target is determined in the following manner:
The initial location search area may be a search range of the sensing target determined according to prior information; for example, the initial location search area is determined based on positioning information of LTE or NR, historical trajectory information, environmental map information, and the like. The LTE or NR positioning information may be used before trajectory sensing, because an approximate location of the target is already known from the LTE or NR positioning method. The historical trajectory information may be trajectory information of daily behavior habits of the sensing target, and the environmental map information may indicate an area in which obstacles exist.
The angle information of the first device to the sensing target may be the second measurement quantity result of the second measurement quantity obtained by the first device by performing APS measurement on the wireless channel in which the sensing target is located, where the second measurement quantity result includes the measured angle of arrival APS and/or angle of departure APS.
In this implementation, because the initial location of the sensing target is determined in the initial location search area, the search range of the sensing target can be narrowed, reducing the calculation amount, and improving the sensing efficiency.
Optionally, the calculating, by the third device, a confidence of each candidate location in a plurality of candidate locations in an initial location search area includes:
The determining an estimated motion trajectory of the sensing target according to the speed and the speed direction of the sensing target may be estimating an estimated motion trajectory of the sensing target at a plurality of moments according to the speed and the speed direction of the sensing target, for example, estimated motion trajectories at moments tn, n=1, 2, 3, . . . , N, where each trajectory point in the estimated motion trajectory corresponds to a moment.
The determining, based on the first candidate location and the estimated motion trajectory, an angle of arrival and/or an angle of departure of a dynamic reflection path corresponding to each trajectory point on the estimated motion trajectory may be assuming that the initial location of the sensing target is the first candidate location, thereby determining the location of each trajectory point on the estimated motion trajectory and calculating the angle of arrival and/or the angle of departure of the dynamic reflection path of each trajectory point to the location of the first device.
The second measurement quantity results reported by the first device may include a second measurement quantity result at a moment corresponding to each trajectory point, namely, including second measurement quantity results at moments tn, n=1, 2, 3, . . . , N. The second measurement quantity result includes an angle of arrival APS and/or an angle of departure APS. In this case, the angle of arrival and/or the angle of departure of the dynamic reflection path corresponding to each trajectory point is substituted into the corresponding angle of arrival APS and/or the angle of departure APS to determine a power value corresponding to each trajectory point. The power value is associated with a trajectory location confidence of the trajectory point, for example, the trajectory location confidence of the trajectory point is equal to a product of the power value and a weight of the first device. Alternatively, when the weight is not set, the trajectory location confidence of the trajectory point is equal to the power value corresponding to the first device.
The determining, by the third device, an initial location confidence corresponding to the first candidate location according to the trajectory location confidence corresponding to each trajectory point on the estimated motion trajectory may be using a product of the trajectory location confidences corresponding to the trajectory points as the initial location confidence of the first candidate location, or a sum of the trajectory location confidences corresponding to the trajectory points as the initial location confidence of the first candidate location.
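To make the foregoing confidence computation concrete, the following minimal sketch (function names, a 2-D geometry, and the product-form combination are illustrative assumptions, not the claimed implementation) evaluates one candidate initial location against the APS measurements of a single first device, multiplying the APS power values looked up at the predicted reflection-path angles of the trajectory points:

```python
import numpy as np

def candidate_confidence(p_cand, speeds, dt, p_dev, aps_list, angle_grid):
    """Initial location confidence of one candidate location for one device.
    aps_list[n] holds the measured angle-of-arrival APS (power vs. angle) at
    moment t_n; speeds[n-1] is the estimated target speed over interval n."""
    conf = 1.0
    p = np.asarray(p_cand, dtype=float)
    for n, aps in enumerate(aps_list):
        if n > 0:
            p = p + np.asarray(speeds[n - 1]) * dt  # trajectory point at t_n
        # predicted angle of the dynamic reflection path towards the device
        ang = np.arctan2(p[1] - p_dev[1], p[0] - p_dev[0])
        # look up the APS power at the predicted angle (nearest grid point)
        idx = np.argmin(np.abs(angle_grid - ang))
        conf *= aps[idx]  # product form; a sum could be used instead
    return conf
```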
In this implementation, the initial location confidence of each candidate location can be calculated, which can improve reliability of the initial location of the sensing target.
Optionally, the target information further includes:
In this implementation, where the trajectory location confidence is determined by a plurality of first devices, a corresponding weight may be determined for each first device.
That the trajectory location confidence is positively correlated with a first value of each first device may be that when the first value of the first device is large, the trajectory location confidence is large; and when the first value of the first device is small, the trajectory location confidence is small. For example, the trajectory location confidence is equal to a sum of the first values of the plurality of first devices, or the trajectory location confidence is equal to a product of the first values of the plurality of first devices.
In this implementation, due to the addition of the weight of each first device, accuracy of the initial location of the sensing target can be further improved.
Optionally, the weight corresponding to each first device is determined by the third device based on device information of the first device, where
The status information includes at least one of:
In some implementations, the weight of the first device may alternatively be reported by the first device to the third device.
The determining of the initial location of the sensing target is described below by taking
As long as any two of UE 1, UE 2, and the base station determine an angle of arrival/angle of departure of a signal reflected by the sensing target at any moment in the trajectory tracking process, the intersection of the lines extended from UE 1, UE 2, and the base station as starting points along the estimated angle directions is the pedestrian location. However, because the sensing capability of a UE is weak, and different UEs may have different sensing capabilities, the pedestrian location obtained through collaborative sensing of UEs (or collaboration between the base station and the UEs) is estimated to fall within a wide area. A greater quantity of UEs participating in sensing indicates a higher confidence of the overlapping area of the areas estimated by all the UEs.
It is assumed that at moments tn, n=1, 2, 3, . . . , N, the channel angle power spectra (APSs) obtained by UE 1, UE 2, and the base station are Pu1(φ1, tn), φ1∈[φmin1, φmax1], Pu2(φ2, tn), φ2∈[φmin2, φmax2], and P0(φ0, tn), φ0∈[φmin0, φmax0]. In the foregoing speed and trajectory estimation for the sensing target, because the speed and the speed direction of the target at each moment can be estimated, the trajectory shape of the target within the period of time tn, n=1, 2, 3, . . . , N can be obtained. The sensing area is divided into a grid map as shown in
λu1(tn), λu2(tn), λ0(tn) are weight coefficients for UE 1, UE 2, and the base station respectively, where the weight coefficients reflect the (measurement) confidences of the foregoing sensing nodes, and are determined by the third device according to prior information such as a sensing capability of the UE (or the base station) participating in sensing, the method used for location determining of the UE (or the base station), coordinate location information of the UE (or the base station), or array orientation information of the UE (or the base station), or may be self-determined and reported by the collaborative sensing UEs (or base stations) according to the foregoing information. The UE (or base station) sensing capability may include system configuration information such as a UE (or base station) sensing bandwidth and a UE (or base station) antenna quantity, and the method used for location determining can represent information about UE (or base station) location accuracy. In addition, the weight coefficient takes a value in the range (0, 1) or (0, some positive upper bound), and a larger value indicates a higher confidence of the sensing result of the corresponding sensing UE (or base station). The weight may be a fixed value during tracking, or may vary within the range of values, that is, may be associated with time or the spatial location of the sensing target. To comprehensively consider the confidences of all N estimated locations on the entire trajectory to determine the most possible trajectory starting location, a confidence of the initial location (xi, yi) is defined as
Cpositionn, n=1, 2, 3, . . . , N, represents the location confidence at different measurement moments (that is, different trajectory points), and Ctracei represents the confidence of the initial location (xi, yi).
It should be noted that Cpositionn may be the location confidence at different measurement moments (that is, different trajectory points) determined in either of the two manners described above. Alternatively, Cpositionn may be obtained by substituting the location confidence defined above, as an independent variable, into a pre-set function that is monotonically increasing in the independent variable.
Assuming that the sensing area is divided into a total of I grids, all grids are traversed to calculate Ctracei, i=1, 2, . . . , I, and the grid corresponding to the maximum trajectory confidence Ctracei obtained is the most possible initial location of the sensing target pedestrian. As shown in
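The grid traversal can be sketched as follows. This is a non-normative illustration: it assumes each collaborating device exposes a function mapping a candidate grid point to its trajectory confidence, and combines the devices by a weighted product before taking the argmax over the I grid points (names and the combination rule are assumptions):

```python
import numpy as np

def best_initial_location(grid_points, conf_fns, weights):
    """Traverse all grid points and pick the one maximizing the weighted
    product of per-device trajectory confidences (illustrative sketch)."""
    best, best_c = None, -np.inf
    for p in grid_points:
        # weighted product across collaborating devices; weights in (0, 1]
        c = np.prod([w * f(p) for f, w in zip(conf_fns, weights)])
        if c > best_c:
            best, best_c = p, c
    return best, best_c
```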
It should be noted that in this embodiment of this application, only two first devices are required to obtain a speed vector vperson of the sensing target when determining the speed and the trajectory of the sensing target. When the quantity of collaborative sensing first devices is larger, the third device or the network side device may pick the two first devices with the highest confidence according to grasped device information of the first devices to perform measurement to obtain the vperson result, or a larger quantity of first devices may be used for measurement to obtain the result of vperson comprehensively.
In addition, in this embodiment of this application, the foregoing manner of calculating the confidence of the initial location of the sensing target may be applied to trajectory tracking of the sensing target. For other collaborative sensing services of the sensing target, when a plurality of first devices can be used for collaborative sensing, the foregoing manner can be used to quantify a confidence of a measurement quantity result of the first device to improve the sensing accuracy. The collaborative sensing herein means that the same one or more measurement quantities can be measured by a plurality of different first devices to achieve a sensing purpose (that is, to obtain a final sensing result), and the final sensing result is synthetically determined based on measurement quantity results of the plurality of first devices.
Optionally, the second measurement quantity is a suppressed second measurement quantity obtained by the first device by suppressing interference energy other than a dynamic reflection path spectrum peak in the second measurement quantity.
For this implementation, refer to related descriptions in the embodiment shown in
Optionally, the receiving, by the third device, second measurement quantity results sent by at least two first devices includes:
The determining, by the third device, a current location of the sensing target according to the initial location, the first measurement quantity results, and the second measurement quantity results includes:
For this implementation, refer to related descriptions in the embodiment shown in
Optionally, the second measurement quantity includes at least one of:
For this implementation, refer to related descriptions in the embodiment shown in
Optionally, the method further includes:
For this implementation, refer to related descriptions in the embodiment shown in
Optionally, the method further includes:
For this implementation, refer to related descriptions in the embodiment shown in
Optionally, the parameter configuration information includes at least one of:
For this implementation, refer to related descriptions in the embodiment shown in
Optionally, the method further includes:
For this implementation, refer to related descriptions in the embodiment shown in
Optionally, the device information includes at least one of:
For this implementation, refer to related descriptions in the embodiment shown in
Optionally, the status information includes at least one of:
For this implementation, refer to related descriptions in the embodiment shown in
Optionally, the determining, by the third device, devices participating in collaborative sensing in the plurality of devices according to the device information sent by the plurality of devices includes:
The sensing area information may be sensing area information determined by a sensing demanding party, or the sensing area information may be pre-configured.
The determining the devices participating in collaborative sensing in the plurality of devices according to the device information sent by the plurality of devices and sensing area information may be determining devices that match the sensing area information in the plurality of devices as the devices participating in the collaborative sensing, such as determining at least two first devices and a second device. Alternatively, a corresponding first device and a second device are allocated to each sub-area in the sensing area.
Optionally, the method further includes:
The allocating devices participating in collaborative sensing to the sensing target may be allocating at least two first devices and a second device to the sensing target, for example, allocating devices participating in the collaborative sensing in a sub-area in which the sensing target is located to the sensing target.
Optionally, the allocating, by the third device, devices participating in collaborative sensing to the sensing target from among the determined devices participating in collaborative sensing includes:
As shown in
Optionally, the allocation of the corresponding device to each sensing sub-area may be pre-configured.
The sensing sub-area in which the sensing target is located may be determined according to an initial location or may be determined according to prior information. This is not limited.
Optionally, the sensing sub-area includes: a network side device sensing sub-area and a terminal sensing sub-area, at least one network side device is allocated to one network side device sensing sub-area, at least one terminal is allocated to one terminal sensing sub-area, and one network side device sensing sub-area covers at least one terminal sensing sub-area; and
In some implementations, the network side device sensing sub-area may alternatively not be set, for example, sensing positioning is performed on the sensing target by at least three terminals.
Optionally, in a case that a same terminal exists in terminals allocated to two terminal sensing sub-areas, the same terminal participates in collaborative sensing in the two terminal sensing sub-areas in a time division multiplexing manner, or a frequency division multiplexing manner, or a code division multiplexing manner.
In this way, the same terminal can participate in collaborative sensing of a plurality of sensing sub-areas.
Optionally, the method further includes:
The updating devices participating in collaborative sensing for the sensing target may be updating based on corresponding information of the sensing target, the first device, or the second device.
In this implementation, the accuracy of the sensing positioning result of the sensing target can be further improved by updating the devices participating in the collaborative sensing for the sensing target.
Optionally, the updating, by the third device, devices participating in collaborative sensing for the sensing target includes at least one of:
In this implementation, the terminal and the network side device participating in the collaborative sensing can be updated in time based on the first condition and the second condition described above.
Optionally, trajectory sensing of the sensing target is initiated by the third device based on a sensing requirement; and/or
The sensing requirement may be initiated by the third device, the first device, the second device, or the sensing target according to actual requirements. In addition, the sensing requirement may include a trajectory tracking initiation condition, and the trajectory tracking initiation condition may include at least one of:
The foregoing trajectory tracking ending condition may be determined by the third device, the first device, the second device, or the sensing target, or may be pre-configured. For example, the trajectory tracking ending condition may include at least one of:
It should be noted that this embodiment is an implementation of a corresponding network side device in the embodiment shown in
The method according to embodiments of this application is described below by using an example in which the sensing target is a pedestrian, the first device is a UE, a base station, or a small base station, the second device is a UE, a base station, or a small base station, and the third device is a core network device.
The positioning manner provided in this embodiment of this application is described in scenarios shown in
In an integrated sensing and communication scenario, motion trajectory tracking of the sensing target can be achieved by collaborative measurement of the sensing target by using stationary devices with sensing capabilities in a specific area. In this embodiment of this application, a device-free positioning method may be used to improve the sensing performance, but a device-based positioning method may also be flexibly combined to improve the sensing performance. The sensing scenario involved is shown in
In this embodiment of this application, the sensing positioning method provided can achieve the following effects:
The sensing computing power is dispersed over idle terminals, reducing the computing power burden on the base station or the network. The method has advantages in scenarios where terminals are dense, or where a base station sensing distance is limited. For example, the base station spontaneously self-collects through echo sensing, and a QoS requirement cannot be achieved when a signal reaches the sensing target and is reflected back to the base station. However, the collaborative sensing terminals are distributed around the sensing target, and signals reflected to reach the sensing terminals still meet the sensing QoS requirement.
Compared to continuous positioning based on existing positioning methods, pilots can be greatly reduced, and device-free sensing has no restriction on whether the sensing target is a terminal.
Compared to single-station or multi-station radar continuous sensing (continuous distance measurement and angle measurement), the sensing positioning provided in embodiments of this application can be used jointly with such methods as a complement to improve accuracy. Meanwhile, it can alternatively be used independently of existing positioning methods.
When the sensing target is also a terminal (that is, having the ability to communicate with a base station), the sensing positioning provided in embodiments of this application can alternatively be flexibly combined with other positioning methods to further improve the sensing accuracy.
Taking the scenario in which the terminal 1, the terminal 2, and the sensing target 1 (pedestrian) are located on the right side of the two scenarios in the figure as an example, as shown in
As shown in
The pedestrian motion speed can be resolved into radial and tangential speeds at the reflection point of the dynamic reflection path. As shown in
and
Based on the foregoing analysis, for the pedestrian-to-terminal 1 reflection path,
and
In the foregoing equations (4) to (6), the locations of the base station, the terminal 1, and the terminal 2 are known, that is, the vectors p0, p1, and p2 are all known, and further, it is assumed that at the moment the pedestrian location vector pperson is also known. The total length change speeds vpath1, vpath2 of the reflection paths may be calculated by each terminal downlink received sensing signal or base station uplink received sensing signal based on multiple signal classification (MUSIC) or other Doppler estimation algorithms. A relationship between the Doppler frequencies fd1, fd2 estimated by the terminal 1 and the terminal 2 and the total length change speeds vpath1, vpath2 of the reflection paths is
λ is a signal wavelength. The foregoing equations (equations (4) to (7)) have a total of 5 scalar unknowns, which are respectively the radial speeds (length change speeds) vpath0, vpath1, vpath2 of the three reflection paths and the two scalar unknowns (that is, the two-dimensional coordinates of the speed vector) corresponding to vperson. In this regard, in this embodiment of this application, only the radial speed component, which affects the reflection path length, is of concern, while the tangential speed component does not affect the reflection path length and is therefore not considered. Therefore, equations (4) to (6) are substituted into equation (7) to obtain vperson.
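Numerically, the substitution above amounts to a small linear solve: the length change speed of each reflection path is the projection of vperson onto the sum of the unit vectors pointing from the transmitter and the receiver to the pedestrian, and the Doppler relation ties that speed to fd via λ. The following sketch assumes a 2-D geometry, the function name, and one sign convention for the Doppler relation (the actual sign depends on convention):

```python
import numpy as np

def solve_target_speed(p0, p1, p2, p_person, fd1, fd2, lam):
    """Solve the 2x2 linear system for the pedestrian speed vector.
    Row k encodes v_person . (u0 + uk) = vpath_k = lam * fd_k, where uk is
    the unit vector from device k towards the pedestrian (sign assumed)."""
    def unit(vec):
        return vec / np.linalg.norm(vec)
    u0 = unit(np.asarray(p_person) - np.asarray(p0))  # base station -> person
    u1 = unit(np.asarray(p_person) - np.asarray(p1))  # terminal 1 -> person
    u2 = unit(np.asarray(p_person) - np.asarray(p2))  # terminal 2 -> person
    A = np.vstack([u0 + u1, u0 + u2])
    b = lam * np.array([fd1, fd2])
    return np.linalg.solve(A, b)
```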
Further, assuming that two adjacent times of measurement are sufficiently short in time (for example, 5 to 10 ms) relative to pedestrian motion, the pedestrian can be approximated as moving in a straight line at a uniform speed in this period, so that a location of the pedestrian at the next moment can be predicted. Therefore, for N consecutive times of measurement in time dimension, given a specific initial location, motion trajectory coordinates of the sensing target are obtained.
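The uniform straight-line approximation above amounts to integrating the per-interval speed estimates from a given initial location; a short illustrative sketch (names and 2-D coordinates assumed):

```python
import numpy as np

def track_trajectory(p_init, speeds, dt):
    """Accumulate per-interval displacements v_person * dt from the initial
    location to obtain the trajectory point at each measurement moment."""
    p_init = np.asarray(p_init, dtype=float)
    steps = np.asarray(speeds, dtype=float) * dt
    return np.vstack([p_init, p_init + np.cumsum(steps, axis=0)])
```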
This embodiment is described by using trajectory tracking as an example. As shown in
Step 801: Initiate a sensing requirement.
The sensing requirement is sent to a core network (sensing network function or sensing network element). A sensing requirement initiator may be a UE, or a base station, or a sensing target itself, or a third-party application server other than an access network and a core network. The sensing requirement may include at least one of the following information:
The trajectory tracking initiation condition includes at least one of:
Step 802: Determine base stations and UEs that participate in sensing.
This step may be to determine base stations for sensing/integrated sensing and communication services and collaborative sensing UEs. The determining of base stations participating in sensing and collaborative sensing UEs herein refers to determining base station and UE candidate sets within a wide range, equivalent to coarse-grained base station and UE selection. Subsequently, in step 804, the sensing area may be divided into smaller areas and the collaborative sensing UEs may be divided into groups, where each group of UEs is mainly responsible for one smaller area.
In this step, the core network (sensing network function or sensing network element) may determine the UEs and base stations participating in collaborative sensing based on sensing requirements and status information reported by the base stations and the UEs. For example, the following manners may be included:
Manner 1. The core network determines the base stations participating in the sensing service based on the sensing requirements and the status information reported by the base stations. There may be one or more base stations participating in sensing for a specific sensing area in which the sensing target is located.
Base station status information is to include at least one of the following information: base station sensing capability indication information (for example, maximum bandwidth and time-frequency resources of the base station that can be used to support sensing, a base station antenna quantity, and a maximum sensing distance of the base station), a maximum communication coverage distance of the base station, base station location information, a base station location information determining method (or equivalently, information characterizing base station location accuracy), base station panel orientation and inclination angle information, beamforming configuration information, and the like.
Manner 2: The base stations participating in sensing and determined by the core network broadcast control information carrying sensing requirements and UE status information reporting requests for the sensing area. UEs in the coverage area of the base station report the UE status information.
The core network determines the UEs participating in the sensing service according to the sensing requirements and the status information reported by the UEs (including a UE communication & sensing status, UE sensing capability information, and other prior information (for example, a UE location)). Alternatively, the base station participating in sensing determines the UEs participating in the sensing service according to the status information reported by the UEs. Alternatively, the core network determines a part of the UEs participating in the sensing service according to the status information reported by the UEs, and the base station participating in sensing determines another part of the UEs participating in the sensing service according to the status information reported by the UEs. The UEs participating in collaborative sensing may also be other types of sensing nodes with equivalent functionality, such as small base stations.
The UE status information is to include: UE location information (that is, UE location coordinates), a UE location information determining method (or equivalently, information characterizing UE location accuracy), a UE motion status indication (that is, whether the UE is currently stationary or not), and UE panel orientation and inclination angle information.
Optionally, the UE status information may further include at least one of the following information: UE sensing capability indication information (for example, maximum bandwidth and time-frequency resources of the UE that can be used to support sensing, a UE antenna quantity, and a maximum sensing distance of the UE), a UE communication status indication (that is, whether a communication service is currently performed), a UE sensing status indication (for example, whether a current period supports collaborative sensing, a time period supporting collaborative sensing, and whether a sensing service is currently performed), UE beamforming configuration information, and the like.
The method of determining the UEs participating in collaborative sensing may be one of:
It should be noted that the accuracy of the sensing target trajectory tracking is closely related to the collaborative sensing UE location accuracy, and the network may select more widely distributed UEs in the sensing area to participate in the collaborative sensing, to guarantee as much as possible that the collaborative sensing UEs are distributed at different locations around the sensing target.
Manner 3: The base station reports determined information about the UEs participating in collaborative sensing to the core network (sensing network function or sensing network element). The reported information is to include at least one of: collaborative sensing UE IDs, collaborative sensing UE location information, a collaborative sensing UE location information determining method (or equivalently, information characterizing UE location accuracy), a total quantity of collaborative sensing UEs in the sensing area, collaborative sensing UE status information, and the like.
Step 803: UE positioning.
This step may be that the core network determines whether to initiate positioning for a part of the UEs, where when the decision is yes, a UE positioning procedure is initiated. For example, through steps 801 and 802, there may be a situation where location information of a part of the UEs participating in collaborative sensing is missing. In this case, optionally, the core network initiates a positioning procedure for the part of the UEs to obtain UE location information. A positioning method used may be an LTE or NR positioning method, or another positioning method.
Upon completion of the positioning procedure for this part of the UEs and acquisition of their location information, the base station or the UE reports the location information of these UEs, the positioning method used (or equivalently, information characterizing the UE location accuracy), and other status information of these UEs to the core network as previously described. The core network finally determines all UEs participating in collaborative sensing.
It should be noted that it is assumed that the core network (sensing network function or sensing network element) has previously stored a unified map coordinate system, which is used for location information of all base stations, UEs, and other sensing terminals participating in sensing in the sensing area.
It should be noted that via steps 802 and 803, the core network (sensing network function or sensing network element) can determine weight coefficients of the UEs according to the status information reported by the base stations and the UEs, that is, their (measurement) confidences as sensing nodes. Optionally, the base station and the UE may also self-determine confidences according to their own status information and report the confidences for reference or use by the core network.
Step 804: Determine base stations participating in sensing, and perform collaborative sensing UE grouping.
This step may be performed by the core network (sensing network function or sensing network element), and the ultimate goal is to allocate base stations participating in sensing and collaborative sensing UEs to sensing targets in the sensing area. The allocation of base stations and UEs may be allocation to the same target at different intervals of the trajectory or to different sensing targets. Specifically, the following manners may be included:
Manner 1. Sensing sub-area division of the sensing area. A sensing sub-area is a smaller physical area within a sensing area. The division (location and size) of the sensing sub-areas may be determined according to information about a quantity and density of sensing targets within the sensing area in the sensing requirement, and/or UE status information, such as a maximum sensing distance of the UE, and/or information about UEs participating in collaborative sensing within the sensing area grasped by the core network, and/or base station status information, such as a maximum sensing distance of the base station or a maximum communication coverage distance, and/or information about base stations participating in collaborative sensing within the sensing area grasped by the core network. If the foregoing information is not available, division may alternatively be performed according to a preset default value (for example, evenly divided or divided according to historical trajectory tracking service division results).
Optionally, the sensing sub-area may be divided in two levels, corresponding to sub-area division of the base station and the UE (respectively referred to as a base station sensing sub-area and a UE sensing sub-area below). The physical area sizes of the two levels of division may be different. Generally, a quantity of base stations is less than that of UEs, but a coverage area of the base stations is larger, to support a larger sensing distance. Therefore, the physical range of the base station sensing sub-area is typically larger than that of the UE sensing sub-area. One base station sensing sub-area may include one or more UE sensing sub-areas, and base station sensing sub-areas may be physically discontinuous. As shown in
Manner 2. Allocate base stations participating in sensing to each base station sensing sub-area, and allocate a group of collaborative sensing UEs to each UE sensing sub-area. The allocated base stations and UEs are from the set of base stations participating in sensing and the collaborative sensing UE set determined in step 802.
The allocation of collaborative sensing UEs to the UE sensing sub-area may be based on at least one of the following items in the UE status information: UE location information, UE sensing capability indication information, a UE sensing status indication, or a UE communication status indication. One sensing sub-area corresponds to one UE group, and a total quantity of collaborative sensing UEs within one UE group is at least two.
There may be one or more base stations participating in sensing within the sensing area, and there may be one or more base stations participating in sensing in one base station sensing sub-area. The allocation of the base stations participating in sensing to the base station sensing sub-area may be based on at least one of the following items in the base station status information: base station location information, base station sensing capability indication information, a base station sensing status indication, or a base station communication status indication.
The UE group is associated with the sensing base stations, and the association may be based on at least one of: a base station sensing sub-area division result, a UE sensing sub-area division result, one or more of the base station status information, or one or more of the UE status information. The core network issues an association result to the base stations participating in sensing. Optionally, the core network forwards the association result to the collaborative sensing UE group.
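The allocation logic above can be illustrated with a minimal sketch. The function name, the nearest-center criterion, and the single shared maximum sensing distance are all simplifying assumptions; an actual implementation would weigh the full UE status information listed above.

```python
import math

def allocate_ues_to_subareas(ue_positions, subarea_centers, max_sensing_dist):
    """Assign each UE to the nearest UE sensing sub-area center that lies
    within the UE's maximum sensing distance (a hypothetical criterion)."""
    groups = {i: [] for i in range(len(subarea_centers))}
    for ue_id, (ux, uy) in ue_positions.items():
        best, best_d = None, float("inf")
        for i, (cx, cy) in enumerate(subarea_centers):
            d = math.hypot(ux - cx, uy - cy)
            if d <= max_sensing_dist and d < best_d:
                best, best_d = i, d
        if best is not None:
            groups[best].append(ue_id)
    # a valid collaborative sensing UE group needs at least two UEs
    return {i: g for i, g in groups.items() if len(g) >= 2}
```

Sub-areas left without at least two UEs would then fall under the time/frequency/code division multiplexing scheduling described next.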
In addition, considering the randomness of UE distribution, some sensing sub-areas may not have a sufficient quantity of collaborative sensing UEs. In this case, the base station may schedule adjacent UEs in other UE groups that meet the sensing requirement to perform collaborative sensing in a time division multiplexing (or frequency division multiplexing or code division multiplexing) manner. As shown in
It should be noted that the reasons for dividing the base station and UE sensing sub-areas and the sensing blind areas, grouping the sensing UEs, and associating UEs with base stations may include:
1: The sensing target is mobile, and its range of movement may be larger than the maximum sensing range of the UE, or even of the base station. This requires grouping of UEs within the area, so that the target trajectory is sensed in different trajectory segments (sub-areas) by different UE groups and/or different base stations.
2: The sensing service may be to sense a plurality of targets in a large area. Grouping the UEs helps avoid scheduling collisions as much as possible, that is, avoid assigning the same UE to sense different targets at the same time.
3: Generally, the initial location of the sensing target is not known (it is only confirmed to be in the sensing area). Dividing the sub-areas facilitates determining the initial collaborative sensing UE grouping and the associated base stations participating in sensing, reducing the search range for the subsequent estimation of the target initial location and reducing the complexity.
Step 805: The base stations and the collaborative sensing UEs configure sensing-related parameters.
After the network completes the collaborative sensing UE grouping, the sensing network function/sensing network element sends sensing/integrated sensing and communication signal/NR reference signal related configuration parameter information to the base stations participating in sensing. The configuration parameter information of the UEs may be conveyed by the sensing network function/sensing network element through NAS signaling, or the sensing/integrated sensing and communication signal configuration parameter information may be sent by the sensing network function/sensing network element first to the base stations participating in sensing and then forwarded by the base stations to the UEs.
The configuration parameter information includes at least one of:
It should be noted that after matching of the node parameters is completed, optionally, the core network sends a sensing start indication to at least one base station and at least one collaborative sensing UE group in the sensing area according to the sensing requirement and prior information available to the core network.
Step 806: The collaborative sensing UEs and/or the base stations perform Doppler frequency measurement and reporting.
In this step, the collaborative sensing UEs (or the base stations) perform Doppler frequency measurement, where the measurement may use newly designed sensing signals or integrated sensing and communication signals, or reference signals currently used for NR, such as a primary synchronization signal (PSS) and a secondary synchronization signal (SSS) in a downlink synchronization signal and physical broadcast channel block (Synchronization Signals (SS) and Physical Broadcast Channel Block, SSB), a DMRS carried in a PBCH, or a DL-PRS. The foregoing sensing signals, integrated sensing and communication signals, or reference signals are continuously distributed in time domain, and their distribution density determines the maximum Doppler frequency range that the UEs can measure. The duration of the sensing/integrated sensing and communication signals/NR reference signals used determines the Doppler frequency resolution for the UEs. Based on the sensing/integrated sensing and communication signals/NR reference signals, the UEs obtain a channel estimate on the time-frequency resource where the signal is located. The obtained channel estimate includes Doppler information of the dynamic reflection path caused by movement of the sensing target. An algorithm used by the UE to estimate the Doppler frequency may be FFT (including zero-padded FFT), a MUSIC algorithm, an ESPRIT algorithm, a space-alternating generalized expectation-maximization (SAGE) algorithm, or the like. The dynamic reflection path can be identified by power variation signature identification based on a Doppler peak/Doppler path. After obtaining a measurement result, each UE reports, to the base station, the measurement result of the Doppler frequency or the change speed vpathn of the total length of the dynamic reflection path calculated according to the foregoing equation (7).
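The zero-padded FFT option named above can be sketched as follows, assuming per-symbol complex channel estimates sampled at a fixed slow-time interval. The mean subtraction is a crude stand-in for dynamic-path isolation, and all names are illustrative.

```python
import numpy as np

def estimate_doppler(h_slow_time, delta_t, zero_pad=8):
    """Estimate the dominant Doppler frequency from per-symbol channel
    estimates sampled every delta_t seconds, using a zero-padded FFT."""
    n = len(h_slow_time)
    nfft = zero_pad * n
    # remove the static (near-zero-Doppler) component: LOS and static paths
    h = np.asarray(h_slow_time) - np.mean(h_slow_time)
    spec = np.fft.fftshift(np.abs(np.fft.fft(h, nfft)))
    freqs = np.fft.fftshift(np.fft.fftfreq(nfft, d=delta_t))
    return freqs[np.argmax(spec)]
```

Consistent with the text, the maximum unambiguous Doppler here is set by the symbol spacing (about ±1/(2·delta_t)) and the resolution by the total duration (about 1/(n·delta_t)); zero padding only refines the search grid.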
Optionally, the dynamic reflection path Doppler of the sensing target may also be measured by the base station based on uplink sensing/integrated sensing and communication signals/NR reference signals. If NR reference signals are used, SRS signals used for uplink channel estimation may be used, or UL-SRS signals originally designed for NR positioning may be used. The base station can obtain measurement results of the Doppler frequency and the change speed vpath of the total length of the reflection path (according to the foregoing equation (7)) for the plurality of UEs.
It should be noted that there may be cases where a part of the collaborative sensing UEs measure the Doppler frequency based on the downlink sensing/integrated sensing and communication signals/NR reference signals, and the other part of the collaborative sensing UEs send uplink sensing/integrated sensing and communication signals/NR reference signals, with the base stations measuring the Doppler frequency. In this case, the measurement results of the Doppler frequency of the downlink signal or the change speed vpathn of the total length of the dynamic reflection path (according to the foregoing equation (7)) are reported by the UEs to the base stations, and then further forwarded by the base stations to the core network (sensing network function/sensing network element).
Optionally, when the Doppler frequency is measured, the UE records and saves a measurement timestamp, and reports the measurement timestamp to the base station along with the Doppler measurement result. If the conversion of the measurement quantity to the sensing result is performed at the core network (sensing network function or sensing network element), the base station reports the Doppler measurement result of each UE together with measurement timestamp information to the core network. If the measurement is periodic measurement (that is, the time interval between any two adjacent measurements is the same, for example, when periodic UL-SRSs or DL-PRSs are used), a measurement serial number and a measurement (sensing/integrated sensing and communication signals/NR reference signals) periodicity may be reported instead of the timestamp information.
It should be noted that trajectory tracking sensing involves measuring Doppler frequencies (or the change speed of the total length of the dynamic reflection path) a plurality of times. The corresponding historical measurement results and corresponding timestamp information (or measurement serial numbers and periodicities) are reported to the core network (sensing network function or sensing network element) and then stored in the core network according to computational accuracy requirements, for subsequent further computation and updating of the sensing target trajectory results.
Step 807: The collaborative sensing UEs and/or the base stations perform channel angle power spectrum (APS) measurement and reporting.
This step is performed simultaneously with, or in a time-shared manner with, step 806. To locate the sensing target, the total quantity of nodes performing channel APS measurement is greater than 1: when the quantity of UEs performing APS measurement is 1, the base station side needs to perform APS measurement; and when the quantity of UEs performing APS measurement is greater than 1, the base station side may skip APS measurement. The angle power spectrum (APS) measurement method may be one of the following:
The collaborative sensing UEs perform channel APS measurement. In this case, the base stations do not participate in channel APS measurement, and only two or more UEs participating in collaborative sensing perform APS measurement. If the sensing/integrated sensing and communication signals/NR reference signals are sent through a downlink, each UE measures the AOA APS of the channel, and then each UE reports a measured AOA APS result to the base station and/or the core network.
The base stations participating in sensing perform channel APS measurement. In this case, the UEs collaboratively send only the uplink sensing/integrated sensing and communication signals/NR reference signals, and the AOA APS on the base station side and the AOD APS measurement results on the UE side are both obtained on the base station side.
The base stations participating in sensing and the collaborative sensing UEs perform channel APS measurement, and the UEs report APS measurement results. This case is a combination of the two previously described cases, that is, the collaborative sensing UEs perform angle APS measurement on the UE side and the base stations participating in sensing perform angle APS measurement on the base station side. In this case, the total quantity of associated base stations and UEs is at least two, that is, at least one base station and one UE are required to complete the measurement. A base station associated with a UE refers to a base station that sends or receives sensing/integrated sensing and communication signals/NR reference signals with a collaborative sensing UE when performing trajectory tracking sensing.
The APS measurement at each node can be implemented based on the node's own algorithms, or can be obtained based on angle measurement methods and NR beam management methods in current NR positioning technologies:
If the APS measurement is implemented based on the node's own algorithms, the base station and the UE may apply FFT (including zero-padded FFT) or commonly used spatial domain filters, such as the Bartlett beamformer, MVDR, MUSIC, and refinement algorithms thereof, to the received sensing/integrated sensing and communication/NR reference signals, or to uplink/downlink channel estimates derived from the signals.
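As a sketch of the Bartlett (conventional) beamformer option named above, assuming a uniform linear array with half-wavelength spacing and ideal array snapshots (all function and parameter names are illustrative):

```python
import numpy as np

def bartlett_aps(snapshots, n_ant, wavelength, spacing, angles_deg):
    """Bartlett beamformer angle power spectrum for a uniform linear
    array; snapshots has shape (n_ant, n_snapshots)."""
    # sample spatial covariance matrix
    R = snapshots @ snapshots.conj().T / snapshots.shape[1]
    aps = []
    for theta in np.deg2rad(angles_deg):
        k = np.arange(n_ant)
        # steering vector toward angle theta
        a = np.exp(-2j * np.pi * spacing / wavelength * k * np.sin(theta))
        aps.append(np.real(a.conj() @ R @ a) / n_ant)
    return np.array(aps)
```

MVDR or MUSIC would replace the `a^H R a` scan with an inverse- or subspace-based spectrum over the same steering vectors.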
The angle measurement and NR beam management methods in LTE or NR positioning technologies may include one of the following:
1. For downlink base-station-side AOD APSs, downlink DL-PRS beams may be sent (beam sweeping), and the collaborative sensing UEs may receive the DL-PRSs and perform DL-PRS RSRP measurement. The difference from the NR procedure is that the UE does not feed back only the maximum-RSRP beam index information to the base station, but feeds back the corresponding RSRP measurement result for each beam, from which the base station obtains the channel AOD APS.
2. For downlink UE-side AOA APSs, if the UE has a beam sweeping capability and a strong beamforming capability, after the DL-PRS-based downlink base-station-side AOD is determined, the base station side fixes the optimal downlink beam (that is, the base station downlink beam corresponding to the maximum DL-PRS RSRP measured on the UE side), and the UE performs beam sweeping reception and measures the DL-PRS RSRP to obtain the channel AOA APS.
3. For uplink UE-side AOD APSs, if the UE has a beam sweeping capability and a strong beamforming capability, the UE sends uplink UL-SRS beams (beam sweeping), and the base stations participating in sensing receive the UL-SRS signals and perform UL-SRS RSRP measurement. The difference from the NR procedure is that the base station does not indicate only the maximum-RSRP beam index information to the UE, but sends the corresponding RSRP measurement result for each beam, from which the UE obtains the channel AOD APS.
4. For uplink base-station-side AOA APSs, the base station instructs the UE to fix the uplink UL-SRS beam (that is, the UE uplink beam corresponding to the maximum UL-SRS RSRP measured on the base station side) based on the UL-SRS RSRP measurement result, and the base station performs beam sweeping reception and measures the UL-SRS RSRP to obtain the channel AOA APS.
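The per-beam RSRP reporting described in methods 1 to 4 can be reduced to a coarse APS, one sample per swept beam direction. The reporting format assumed here is illustrative, not specified:

```python
def rsrp_to_aps(beam_dirs_deg, rsrp_dbm):
    """Turn per-beam RSRP reports (dBm) into a coarse APS in linear
    scale, one sample per swept beam direction, and pick the dominant
    direction (an assumed, simplified reporting format)."""
    aps = [(d, 10 ** (r / 10.0)) for d, r in zip(beam_dirs_deg, rsrp_dbm)]
    peak_dir = max(aps, key=lambda p: p[1])[0]
    return aps, peak_dir
```

The angular resolution of such an APS is limited by the beam sweep granularity, which is why the own-algorithm spectra above can be finer.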
It should be noted that there may be cases where a part of the collaborative sensing UEs measure channel UE-side AOA and/or base-station-side AOD APSs based on downlink sensing/integrated sensing and communication signals/NR reference signals, and the other part of the collaborative sensing UEs send uplink sensing/integrated sensing and communication signals/NR reference signals, with the base stations measuring channel UE-side AOD and/or base-station-side AOA APSs. In this case, the APS measurement results of the UEs need to be reported to the base stations, and then further forwarded by the base stations to the core network (sensing network function/sensing network element).
It should be noted that through the APS measurement method described above, the overall APS of the channel including the dynamic reflection path of the sensing target is obtained, that is, the obtained APS has a plurality of spectrum peaks, including an LOS path spectrum peak, other static reflection path spectrum peaks, and a sensing target dynamic reflection path spectrum peak. Generally, a static reflection path spectrum peak and a dynamic reflection path spectrum peak do not overlap.
Optionally, to suppress or avoid the impact of the LOS path and other static reflection paths on the conversion from measurement quantities to sensing results (step 808), the spectrum peak corresponding to the dynamic reflection path may be identified and tracked during trajectory tracking sensing by detecting fluctuations in spectrum peak power, by other pattern recognition methods, or by machine learning methods, based on results of a plurality of measurements performed in time domain using the APS measurement method described above. The interference energy other than the dynamic reflection path spectrum peak may then be suppressed in subsequent measurement, and the reported APS is the APS result obtained after suppression of the interference energy.
Optionally, the UE may choose to report the whole channel APS results, or report APS results within a set angle range (obtained through historical APS measurement or dynamic reflection path spectrum peak identification) corresponding to the dynamic reflection path of the sensing target, reducing reporting overheads for the UE.
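The power-fluctuation identification of the dynamic reflection path spectrum peak described above can be sketched as follows, under the assumption that static peaks keep nearly constant power across repeated APS measurements while the dynamic peak fluctuates:

```python
import numpy as np

def find_dynamic_peak(aps_history):
    """Identify the dynamic reflection path spectrum peak as the angle
    bin whose power fluctuates most across repeated APS measurements
    (power-variation signature); aps_history has shape
    (n_measurements, n_angle_bins)."""
    aps = np.asarray(aps_history, dtype=float)
    fluctuation = np.var(aps, axis=0)   # static peaks: low variance
    return int(np.argmax(fluctuation))  # dynamic peak: high variance
```

The identified bin (or a small angular window around it) is also what a UE could report when choosing the reduced-overhead reporting option above.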
Optionally, the core network (sensing network function or sensing network element) obtains location coordinates of the currently measured sensing target based on the Doppler frequency measurement result (step 806) and the trajectory initial location estimation result (step 808), further obtains an angle of arrival (or angle of departure, depending on whether the measurement is uplink or downlink) from the sensing target to each collaborative sensing UE or base station, and then issues the angle of arrival (or angle of departure) to each collaborative sensing UE. Based on the result, the UE feeds back an APS measurement value for the corresponding angle to the core network.
It should be noted that the core network forwards, to the UE, the location coordinates of the sensing target based on the Doppler estimate, or the corresponding converted angle value, and the UE reports the APS value based on this value. In this case, dynamic reflection path spectrum peak identification is not required.
It should be noted that when measuring the APS, the UE needs to record and save a measurement timestamp, and report the timestamp to the base station along with the APS measurement result. If the conversion of the measurement quantity to the sensing result is performed at the core network (sensing network function or sensing network element), the base station reports the APS measurement result of each UE together with measurement timestamp information to the core network. If the measurement is periodic measurement (that is, the time interval between any two adjacent measurements is the same, for example, when periodic UL-SRSs or DL-PRSs are used), a measurement serial number and a measurement (sensing/integrated sensing and communication signals/NR reference signals) periodicity may be reported instead of the timestamp information.
It should be noted that trajectory tracking sensing involves measuring APSs a plurality of times. The corresponding historical measurement results and corresponding timestamp information (or measurement serial numbers and periodicities) are reported to the core network (sensing network function or sensing network element) and then stored in the core network according to computational accuracy requirements, for subsequent further computation and updating of the sensing target trajectory results.
Step 808: The core network calculates a sensing result based on the measurement quantity result.
This step may be understood as the conversion of the measurement quantity result to the sensing result. After each node completes channel APS measurement, the APS results can be stored locally or reported to upstream nodes. For example, the UE reports the measured APS to the base station, or the base station reports its own measured APS and the received UE APS measurement results to the core network (sensing network function or sensing network element), and the core network performs the calculation and conversion of the measurement quantity to the sensing result. Optionally, the base station may alternatively perform the calculation and conversion of the measurement quantity to the sensing result based on its own measured APS and/or the received UE APS measurement results, and on its own status information and the UE status information stored locally or issued by the core network. For convenience of description, the nodes performing the conversion of measurement quantities to sensing results are hereinafter collectively referred to as computing nodes (which may be the core network or base stations).
The conversion of measurement quantity results to sensing results may include steps A and B as follows.
Step A: Trajectory initial location determining. Generally, an accurate initial location of the sensing target is not known when the trajectory tracking service starts. Depending on the case, the accurate initial location of the sensing target may be determined in one of the following manners:
Manner 1: The sensing target is not required to be a UE, and an accurate initial location of the trajectory of the sensing target is determined according to the measurement results of step 806 and step 807. If the computation of the trajectory initial location is done at the core network (sensing network function or sensing network element) and the APS measurement result of step 807 has been reported to the core network, the core network (sensing network function or sensing network element) may determine an approximate search range of the sensing target based on prior information of the sensing target, where the prior information of the sensing target includes at least one of:
After determining the approximate search range of the initial location of the sensing target, the computing node divides the search range into several search grid points. The size of the grid point is synthetically determined according to sensing capabilities of each collaborative sensing UE (for example, a quantity of antennas for UE angle measurement and sensing signal bandwidth).
Assuming, in turn, that the initial location of the sensing target is each of the divided grid points, according to the change speed vpath of the dynamic reflection path of the sensing target in step 806, the speed vector vobj of the sensing target is obtained based on equations (4) to (7) (where vobj corresponds to vperson in equations (4) to (6)).
With reference to the speed vector vobj of the sensing target, the measurement timestamp information or the measurement periodicity, and the grid point locations described above, location coordinates of the sensing target can be obtained for each time the measurement at step 806 is performed. Based on the location coordinates of the sensing target, the location coordinates of the base stations participating in sensing, and/or the location coordinates of the collaborative sensing UEs, an angle (AOD or AOA) of the dynamic reflection path on the side of the base stations participating in sensing and/or on the side of the collaborative sensing UEs is obtained for the location coordinates of the sensing target.
The angle value is substituted into the corresponding formula for calculating the location confidence in the embodiment shown in
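The grid search of Manner 1 can be sketched as follows. The squared angular error used here is only a stand-in for the location-confidence formula of the referenced embodiment, angle wraparound is ignored for brevity, and all names are illustrative:

```python
import math

def search_initial_location(grid_points, ue_positions, measured_aoas):
    """Grid search over candidate initial locations: for each grid point,
    compare the geometric angle from each collaborative sensing UE to the
    candidate with that UE's measured AOA peak, and keep the candidate
    with the smallest total squared angular error."""
    best, best_err = None, float("inf")
    for gx, gy in grid_points:
        err = 0.0
        for ue_id, (ux, uy) in ue_positions.items():
            predicted = math.degrees(math.atan2(gy - uy, gx - ux))
            err += (predicted - measured_aoas[ue_id]) ** 2
        if err < best_err:
            best, best_err = (gx, gy), err
    return best
```

Finer grids tighten the estimate at the cost of more candidate evaluations, which is why the grid size is tied to the UE sensing capabilities described above.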
Manner 2: The sensing target is not required to be a UE, and the base stations participating in sensing or the collaborative sensing UEs determine the initial location of the sensing target based on spontaneous self-collected sensing signals. In this case, the node (one of the base stations or the collaborative sensing UEs) performing the target initial location sensing needs to temporarily occupy more time domain resources (that is, the density and the quantity of repetitions of the sensing/integrated sensing and communication signals/reference signals in time domain and the length of time covered need to be increased), frequency domain resources (that is, the distribution density of the sensing/integrated sensing and communication signals/reference signals in frequency domain and the frequency range covered need to be increased), and spatial domain resources (that is, the quantity of antennas used for sensing and the antenna array aperture need to be increased).
The specific sensing node is determined by the core network (sensing network function or sensing network element) based on the sensing capability indication information reported by each node (refer to step 802). If the spontaneous self-collecting sensing node is a base station, the core network instructs the base station to perform spontaneous self-collecting sensing signal sensing, a specific algorithm is implemented by the base station, and the base station reports the obtained sensing target location information to the core network; and if the spontaneous self-collecting sensing node is a collaborative sensing UE, the core network instructs the UE with the strongest sensing capability to perform spontaneous self-collecting sensing signal sensing, a specific algorithm is implemented by the UE, and the UE reports the obtained sensing target location information to the core network.
The above procedure may be performed by only one base station or one UE, by one base station and one or more UEs with results being reported separately, or by only a plurality of UEs with results being reported separately. The core network (sensing network function or sensing network element) then determines the final initial location coordinates of the sensing target synthetically.
Manner 3: The sensing target is required to be a UE, and the initial location of the sensing target is determined based on an NR positioning method. Whether the sensing target is a UE is indicated in a sensing requirement. When the sensing target is also a UE, the core network may determine to initiate sensing target positioning.
Manner 4: The sensing target is required to be a UE, and the initial location is determined by GPS; or the sensing target is not required to be a UE, and the initial location is determined by such methods as Bluetooth or UWB.
It should be noted that any two or three of the methods for determining the initial location of the sensing target may be used in combination to further improve the sensing accuracy.
Step B: Calculate a current location of the sensing target based on a set confidence criterion. The methods of step A above may all be performed after the first measurement of the trajectory tracking sensing (that is, after steps 806 and 807 are performed for the first time). In addition, for Manner 1 provided in this embodiment of this application, the core network (sensing network function or sensing network element) may re-retrieve the stored Doppler frequency (or dynamic reflection path length change speed) and historical channel APS measurement results during a plurality of subsequent measurements to correct and update the estimated trajectory initial location and the overall trajectory coordinates.
The current location of the sensing target is jointly determined based on the initial location determined in step A, historically calculated one or more speed vectors vobj of the sensing target, and the currently measured APS. Strictly speaking, each time APS measurement is added, the trajectory is updated and the current location coordinates of the sensing target are determined.
Further, for the current measurement, location coordinates of the sensing target at a next measurement moment can be predicted based on the location of the sensing target and a speed vector vobj of the sensing target currently calculated.
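The trajectory update and next-location prediction described above can be sketched under a constant-velocity-between-measurements assumption (function names are illustrative):

```python
def integrate_trajectory(initial_xy, speed_vectors, delta_t):
    """Rebuild the trajectory by accumulating the per-measurement speed
    vectors vobj from the initial location (step A result)."""
    traj = [initial_xy]
    for vx, vy in speed_vectors:
        x, y = traj[-1]
        traj.append((x + vx * delta_t, y + vy * delta_t))
    return traj

def predict_next_location(current_xy, v_obj, delta_t):
    """Predict the target location at the next measurement moment from
    the current location and the currently calculated speed vector."""
    x, y = current_xy
    vx, vy = v_obj
    return (x + vx * delta_t, y + vy * delta_t)
```

In practice, each new APS measurement would re-weight these dead-reckoned coordinates according to the confidence criterion mentioned next.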
The confidence criterion may be specifically referred to the location confidence and the initial location confidence of the sensing target, and weight coefficients of various collaborative sensing UEs described in the embodiment shown in
Step 809: Forward sensing results.
The trajectory tracking sensing service corresponds to a continuous period of time, and the sensing results may be fed back in real time according to the sensing requirement, or may be fed back as a whole after the trajectory tracking sensing is completed. After the sensing service starts and steps 801 to 805 are performed, steps 806 to 808 need to be iteratively performed to generate and output the current location of the sensing target (and the predicted location of the sensing target at the next moment).
The trajectory tracking ending condition may include at least one of:
If real-time feedback of the trajectory tracking sensing results is required, the computing node sends the current latest trajectory tracking sensing results (either the current sensing target location results or results including historical trajectories of the sensing target) to the sensing demanding party through the core network each time step 808 is completed. If the overall trajectory results are fed back after the completion of the trajectory tracking sensing service, the core network temporarily stores the historical trajectory tracking results and sends the results to the sensing demanding party at the end of the sensing service.
It should be noted that the embodiment shown in
This embodiment mainly describes switching of a collaborative sensing UE group and/or base stations participating in sensing, where the switching may be triggered by target movement or environmental changes.
As can be learned from the detailed description of the trajectory tracking procedure, because the sensing target moves while the sensing service proceeds, the target may move out of the sensing range of the original collaborative sensing UE group. In this case, the network needs to allocate a new collaborative sensing UE group, and even new base stations participating in sensing, to the sensing target. The new collaborative sensing UE group may include some of the UEs within the original collaborative sensing UE group. When new sensing base stations are allocated, a new collaborative sensing UE group may be allocated simultaneously, or the original collaborative sensing UE group may be used, that is, the core network re-associates the collaborative sensing UEs with the base stations.
The condition triggering switching of the base stations participating in sensing may be at least one of the following:
It should be noted that in this embodiment of this application, in addition to a common case where there is only one base station in a base station sensing sub-area, there may be a case where a quantity of sensing UEs is 1 and a quantity of base stations participating in sensing is 2.
The condition triggering switching of the collaborative sensing UE group may include at least one of the following:
A procedure of switching a base station participating in sensing may include:
(1) If the switching condition is met, switching of the base station participating in sensing is triggered.
If the node detecting the trigger condition is the original sensing base station (as described in the third condition and the fourth condition above), the original base station sends a sensing base station switching request to the core network.
The node detecting the trigger condition may alternatively be the core network (as described in the first, second and fifth conditions above).
(2) Manner 1: The core network determines a new base station participating in sensing (optionally, steps 802, 804, and 805 may be performed), and sends a sensing switching preparation indication to the new sensing base station. Alternatively,
Manner 2: The core network determines a new base station participating in sensing (optionally, steps 802, 804, and 805 may be performed) and sends an ID of the new sensing base station to the original sensing base station, and the original sensing base station sends a sensing switching preparation indication to the new sensing base station.
(3) Manner 1: The new sensing base station and the collaborative sensing UE group prepare for switching, and after preparation, report a switching preparation success indication to the core network.
Manner 2: The new sensing base station and the collaborative sensing UE group prepare for switching, and after preparation, send a switching preparation success indication to the original sensing base station and the core network.
After receiving the switching preparation success indication, the core network sends a sensing start indication to the new sensing base station and the associated collaborative sensing UE group.
(4) The new sensing base station and the collaborative sensing UE group perform sensing, and report sensing measurement quantity results to the core network (perform steps 806 and 807 above). Optionally, at least one of the new sensing base station and the new collaborative sensing UE group sends a sensing start indication response to the core network.
(5) After receiving the sensing measurement quantity results or the sensing start indication response reported by the new collaborative sensing UE group, the core network sends a sensing stop indication to some or all of the UEs in the original collaborative sensing UE group (which may be sent through NAS signaling, or via the base station).
(6) After some or all of the UEs in the current collaborative sensing UE group receive the sensing stop indication, the sensing measurement is stopped and the switching is completed.
A specific collaborative sensing UE group switching procedure may include:
(1) If the switching condition is met, switching of the collaborative sensing UE group is triggered.
If the node detecting the trigger condition is a collaborative sensing UE and/or a base station participating in sensing (as described in the third condition and the fourth condition above), the corresponding UE or base station sends a sensing UE group switching request to the core network.
The node detecting the trigger condition may alternatively be the core network (as described in the first, second and fifth conditions above).
(2) The core network determines a new collaborative sensing UE group (optionally, steps 802 to 805 may be performed) and sends a sensing start indication to the new collaborative sensing UE group (may be sent through NAS signaling, or via the base station).
(3) The new collaborative sensing UE group performs collaborative sensing, and reports sensing measurement quantity results (perform steps 806 and 807 above). Optionally, the UEs within the new collaborative sensing UE group send a sensing start indication response to the core network.
(4) After receiving the sensing measurement quantity results or the sensing start indication response reported by the new collaborative sensing UE group, the core network sends a sensing stop indication to some or all of the UEs in the original collaborative sensing UE group (which may be sent through NAS signaling, or via the base station).
(5) After some or all of the UEs in the current collaborative sensing UE group receive the sensing stop indication, the sensing measurement is stopped and the switching is completed.
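The collaborative sensing UE group switching in steps (1) to (5) above may be sketched, purely for illustration, as the following core-network-side logic. All class and method names here are hypothetical and not part of this application; the signaling transport (NAS signaling or forwarding via the base station) is abstracted into placeholder methods.

```python
class CoreNetworkSensingFunction:
    """Illustrative sketch of the core-network role in collaborative
    sensing UE group switching (steps (1)-(5) above)."""

    def __init__(self):
        self.active_ue_group = set()  # current collaborative sensing UE group

    def on_switch_trigger(self, candidate_ues):
        """Step (2): determine the new collaborative sensing UE group and
        send each member a sensing start indication."""
        new_group = self.select_ue_group(candidate_ues)
        for ue in new_group:
            self.send_sensing_start_indication(ue)
        return new_group

    def on_new_group_confirmed(self, new_group):
        """Steps (4)-(5): after measurement results or start-indication
        responses arrive from the new group, stop the original group and
        complete the switch."""
        for ue in self.active_ue_group - set(new_group):
            self.send_sensing_stop_indication(ue)
        self.active_ue_group = set(new_group)

    def select_ue_group(self, candidate_ues):
        # Placeholder for the selection procedure (optionally steps 802-805).
        return set(candidate_ues)

    def send_sensing_start_indication(self, ue):
        pass  # via NAS signaling or via the base station

    def send_sensing_stop_indication(self, ue):
        pass  # via NAS signaling or via the base station
```

The sensing base station switching procedure follows the same pattern, with the start/stop indications addressed to the new and original sensing base stations instead of (or in addition to) the UE group.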
It should be noted that if the sensing target enters a sensing blind area (based on the results of the trajectory tracking and the results of the blind area division at step 804), the end of the trajectory tracking sensing service may be triggered, or the service may be switched to another trajectory tracking sensing process (for example, trajectory tracking based on spontaneous self-collection of sensing signals by the sensing node, NR continuous positioning, or GPS/Bluetooth/UWB-based trajectory tracking).
This embodiment mainly describes failure and supplement of collaborative sensing UEs, where the failure and supplement may be triggered by causes specific to the UEs themselves.
During the trajectory tracking sensing service, a collaborative sensing UE may become unable to continue supporting collaborative sensing for reasons of its own. In this case, the network needs to make a failure decision on the collaborative sensing UEs and remove the failed collaborative sensing UEs, and, if necessary, supplement the current collaborative sensing UE group with new UEs.
A triggering condition for failure of a collaborative sensing UE may include at least one of the following:
The movement of the UE may be detected by a device-based positioning method, such as LTE positioning, NR positioning, GPS, Bluetooth, or UWB.
A specific collaborative sensing UE failure and supplement procedure may be:
(1) If the failure condition is met, the relevant UE sends a sensing UE failure indication to the core network.
(2) The core network receives the sensing UE failure indication and determines a new available collaborative sensing UE (optionally, steps 802 to 805 may be performed); and sends a sensing stop indication to the failed UE.
(3) If a new available collaborative sensing UE is currently present and needs to be supplemented, the core network sends a sensing start indication to the newly determined collaborative sensing UE (which may be sent through NAS signaling, or via the base station).
This embodiment mainly describes measurement confidence adjustment for the collaborative sensing UEs and/or the base stations.
The measurement confidence of the collaborative sensing UE in the sensing service process is reflected by the weight coefficients in the corresponding equations in the embodiment shown in
The condition under which the weight coefficients change may be:
Available sensing resources of the collaborative sensing UEs may change. For example, during the sensing service, the UEs may obtain more (or fewer) resources in the time domain (more symbols that can be occupied for sensing/integrated sensing and communication signal/NR reference signal transmission), in the frequency domain (a larger sensing/integrated sensing and communication bandwidth), or in the spatial domain (more antenna ports or antennas for sensing/integrated sensing and communication), so that the sensing capabilities of the UEs change.
In this embodiment of this application, the accuracy of the measurement quantities of the collaborative sensing UEs is related to the accuracy of the locations of the collaborative sensing UEs, and if the collaborative sensing UEs update the locations using a more precise positioning method, the UE measurement confidence also needs to be adjusted.
In this embodiment of this application, the accuracy of the measurement quantities of the collaborative sensing UE is related to the location of the sensing target. For example, for Doppler frequency measurement, the measurement accuracy is higher when a distance between the sensing target and the base station or each collaborative sensing UE satisfies a far-field condition. For APS measurement, the measurement accuracy is higher when the sensing target faces a multi-antenna panel of the UE.
The measurement quantity accuracy of the collaborative sensing UEs is also related to a signal to noise ratio (SNR) at the collaborative sensing UE side. For example, a higher SNR measured by a UE indicates higher measurement accuracy and, correspondingly, a higher measurement confidence.
Based on the definition of the corresponding equations in the embodiment shown in
The measurement confidence may be adjusted by each sensing node reporting updated weight coefficient recommended values to the core network, or may be self-adjusted by the core network.
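As an illustration of the SNR-based confidence condition above, the following sketch derives normalized weight coefficients from per-UE measured SNR values. The linear-SNR mapping is a hypothetical policy chosen only for illustration; this application does not mandate any particular mapping from SNR to weight coefficients.

```python
def snr_based_weights(snr_db_per_ue):
    """Map per-UE measured SNR (in dB) to normalized measurement-confidence
    weight coefficients: a higher SNR yields a higher weight, and the
    weights sum to 1. The linear-SNR mapping is a hypothetical example."""
    linear = [10 ** (snr_db / 10.0) for snr_db in snr_db_per_ue]
    total = sum(linear)
    return [x / total for x in linear]
```

In practice, such updated weight coefficients could be reported by each sensing node as recommended values, or recomputed by the core network itself, as described above.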
This embodiment mainly describes the combined use of this method with satellite positioning and 3rd Generation Partnership Project (3GPP) positioning technologies to improve positioning and trajectory tracking.
The biggest problem with existing outdoor GPS positioning and trajectory tracking is that the GPS signal is easily occluded by tall buildings. The resulting weak GPS signals make positioning in some areas or trajectory tracking on some roads less accurate, and may even make GPS positioning and trajectory tracking services impossible. In addition, the 3GPP positioning scheme is limited by the large deployment intervals of outdoor macro stations, so its positioning accuracy is also limited. In this embodiment of this application, a GPS location obtained before occlusion may be used as an initial location to enable continuous positioning and trajectory tracking of a sensing target in an area with GPS signal occlusion or poor GPS signal coverage, as a complement to existing trajectory tracking methods.
Specifically, it is generally considered that base stations and UEs are still sufficiently densely distributed in an area with GPS signal occlusion or poor GPS signal coverage. When the sensing target is about to enter such an area, the trajectory tracking sensing described in this application may be switched to, with the GPS positioning information of the sensing target used as the initial location information for trajectory tracking. When the sensing target moves out of the area with poor GPS signal coverage, GPS trajectory tracking may be switched back to. This method improves the overall performance of the trajectory tracking service.
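The handover between GPS trajectory tracking and the collaborative sensing method above can be sketched as a simple decision rule. The SNR threshold, function name, and return convention are all hypothetical, chosen only to illustrate that the last good GPS fix seeds the collaborative tracking as its initial location:

```python
def choose_tracking_mode(gps_snr_db, last_gps_fix, min_gps_snr_db=30.0):
    """Hypothetical handover rule: use GPS tracking while the GPS signal
    is strong enough; otherwise fall back to collaborative sensing
    trajectory tracking, seeded with the last good GPS fix as the
    initial location."""
    if gps_snr_db >= min_gps_snr_db:
        return ("gps", None)
    # GPS occluded or weak: switch to collaborative trajectory tracking.
    return ("collaborative_sensing", last_gps_fix)
```

When the target leaves the poorly covered area, the rule naturally switches back to GPS tracking on the next evaluation.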
Embodiments of this application provide a trajectory tracking sensing method based on inter-device collaboration other than LTE positioning and NR positioning. According to this method, device-free trajectory sensing of the sensing target is implemented based on Doppler and angle power spectrum measurement by a plurality of UEs or base stations by using stationary UEs or base stations around the sensing target. The specific implementation steps of the sensing method are provided; the necessary signaling interaction processes between the collaborative sensing UEs, the base stations, and the core network sensing network function are provided; the definitions of the measurement quantity and the sensing result confidence in this method are provided, and the use method is described in conjunction with the embodiments; and the switching methods and procedures of the collaborative UEs and the collaborative base stations are provided.
The target positioning sensing method according to embodiments of this application may be performed by a target positioning sensing apparatus. In this embodiment of this application, the target positioning sensing method provided in embodiments of this application is described by taking the target positioning sensing method performed by the target positioning sensing apparatus as an example.
Refer to
Optionally, the first device includes:
The second device includes:
Optionally, the positioning sensing result includes at least one of:
Optionally, the first signal includes one of:
Optionally, the sending the first measurement quantity result includes:
Optionally, the sending the first measurement quantity result includes:
Optionally, the apparatus further includes:
Optionally, the performing APS measurement on a wireless channel in which the sensing target is located, to obtain a second measurement quantity result of a second measurement quantity includes at least one of:
Optionally, the apparatus further includes:
Optionally, the sending the second measurement quantity result includes:
Optionally, the second measurement quantity includes one of:
Optionally, the apparatus further includes:
Optionally, the third device includes one of:
Optionally, the apparatus further includes:
Optionally, the parameter configuration information includes at least one of:
Optionally, the apparatus further includes:
Optionally, the device information includes at least one of:
Optionally, the status information includes at least one of:
Optionally, the reporting device information of the first device to a third device includes at least one of:
The target positioning sensing apparatus described above can improve positioning capabilities of the communication system.
The target positioning sensing apparatus in this embodiment of this application may be an electronic device, such as an electronic device with an operating system, or a component in an electronic device, such as an integrated circuit or a chip. The electronic device may be a terminal or another device other than a terminal. For example, the terminal may include, but is not limited to, the types of terminals listed in embodiments of this application, and the other device may be a server, a network attached storage (NAS), or the like. This is not specifically limited in this embodiment of this application.
The target positioning sensing apparatus provided in this embodiment of this application can implement the processes implemented in the method embodiment shown in
Refer to
Optionally, the first device includes:
The second device includes:
Optionally, the positioning sensing result includes at least one of:
Optionally, the first signal includes one of:
Optionally, the receiving first measurement quantity results sent by at least two first devices includes:
The determining a positioning sensing result of the sensing target based on the first measurement quantity results sent by the at least two first devices includes:
Optionally, the receiving first measurement quantity results sent by at least two first devices includes:
Optionally, the apparatus further includes:
Optionally, the initial location of the sensing target includes:
Optionally, the initial location of the sensing target determined based on the device-free technology includes:
Optionally, the initial location of the sensing target determined based on the angle information of the sensing target is determined in the following manner:
determining, by the third device, a location with a greatest confidence in the plurality of candidate locations as the initial location of the sensing target.
Optionally, the calculating, by the third device, a confidence of each candidate location in a plurality of candidate locations in an initial location search area includes:
Optionally, the target information further includes:
Optionally, the weight corresponding to each first device is determined by the third device based on device information of the first device, where
Optionally, the status information includes at least one of:
Optionally, the second measurement quantity is a suppressed second measurement quantity obtained by the first device by suppressing interference energy other than a dynamic reflection path spectrum peak in the second measurement quantity.
Optionally, the receiving second measurement quantity results sent by at least two first devices includes:
The determining a current location of the sensing target according to the initial location, the first measurement quantity results, and the second measurement quantity results includes:
Optionally, the second measurement quantity includes at least one of:
Optionally, the apparatus further includes:
Optionally, the apparatus further includes:
Optionally, the parameter configuration information includes at least one of:
Optionally, the apparatus further includes:
Optionally, the device information includes at least one of:
Optionally, the status information includes at least one of:
Optionally, the determining devices participating in collaborative sensing in the plurality of devices according to the device information sent by the plurality of devices includes:
Optionally, the apparatus further includes:
Optionally, the allocating devices participating in collaborative sensing to the sensing target from among the determined devices participating in collaborative sensing includes:
Optionally, the sensing sub-area includes: a network side device sensing sub-area and a terminal sensing sub-area, at least one network side device is allocated to one network side device sensing sub-area, at least one terminal is allocated to one terminal sensing sub-area, and one network side device sensing sub-area covers at least one terminal sensing sub-area; and
Optionally, in a case that a same terminal exists in terminals allocated to two terminal sensing sub-areas, the same terminal participates in collaborative sensing in the two terminal sensing sub-areas in a time division multiplexing manner, or a frequency division multiplexing manner, or a code division multiplexing manner.
Optionally, the apparatus further includes:
Optionally, the updating devices participating in collaborative sensing for the sensing target includes at least one of:
Optionally, the first condition includes at least one of:
Optionally, trajectory sensing of the sensing target is initiated by the third device based on a sensing requirement; and/or
The target positioning sensing apparatus described above can improve positioning capabilities of the communication system.
The target positioning sensing apparatus in this embodiment of this application may be an electronic device, for example, the foregoing third device, such as an electronic device with an operating system, or a component in an electronic device, such as an integrated circuit or a chip. The electronic device may be a network side device, a core network device, a sensing device, or another device other than a terminal. For example, the network side device may include, but is not limited to, the types of network side devices listed in embodiments of this application, and the other device may be a server, a network attached storage (NAS), or the like. This is not specifically limited in this embodiment of this application.
The target positioning sensing apparatus provided in this embodiment of this application can implement the processes implemented in the method embodiment shown in
Optionally, as shown in
An embodiment of this application further provides a communication device, including a processor and a communication interface, where the communication interface is configured to perform sensing measurement on a sensing target to obtain a first measurement quantity result of a first measurement quantity of a dynamic reflection path of a first signal, where the first measurement quantity includes: at least one of a reflection path Doppler frequency or a reflection path length change speed, and the first signal is a signal sent by a second device to the first device; and send the first measurement quantity result, where the first measurement quantity result is used for determining a positioning sensing result of the sensing target. This communication device embodiment corresponds to the foregoing first device-side method embodiment. Each implementation process and implementation of the foregoing method embodiment can be applied to this terminal embodiment, and can achieve the same technical effects.
Specifically,
The communication device 1200 includes, but is not limited to, at least some components such as a radio frequency unit 1201, a network module 1202, an audio output unit 1203, an input unit 1204, a sensor 1205, a display unit 1206, a user input unit 1207, an interface unit 1208, a memory 1209, and a processor 1210.
A person skilled in the art may understand that the terminal 1200 may further include a power supply (for example, a battery) for supplying power to the components. The power supply may be logically connected to the processor 1210 by using a power management system, thereby implementing functions such as charging, discharging, and power consumption management by using the power management system. A terminal structure shown in
It should be understood that in this embodiment of this application, the input unit 1204 may include a graphics processing unit (GPU) 12041 and a microphone 12042, and the graphics processing unit 12041 processes image data of still images or videos obtained by an image capture apparatus (for example, a camera) in a video capture mode or an image capture mode. The display unit 1206 may include a display panel 12061, and the display panel 12061 may be configured in a form such as a liquid crystal display or an organic light-emitting diode. The user input unit 1207 includes at least one of a touch panel 12071 and another input device 12072. The touch panel 12071 is also referred to as a touchscreen. The touch panel 12071 may include two parts: a touch detection apparatus and a touch controller. The another input device 12072 may include, but is not limited to, a physical keyboard, a functional key (for example, a volume control key or a switch key), a track ball, a mouse, and a joystick, which are not repeated herein.
In this embodiment of this application, the radio frequency unit 1201 receives downlink data from a network side device and then transmits the data to the processor 1210 for processing. In addition, the radio frequency unit 1201 may send uplink data to the network side device. Generally, the radio frequency unit 1201 includes, but is not limited to, an antenna, an amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
The memory 1209 may be configured to store software programs or instructions, and various pieces of data. The memory 1209 may mainly include a first storage area storing a program or instructions and a second storage area storing data. The first storage area may store an operating system, an application program or instruction required by at least one function (for example, a sound playing function and an image playing function), and the like. In addition, the memory 1209 may include a volatile memory or a non-volatile memory, or the memory 1209 may include both a volatile and a non-volatile memory. The non-volatile memory may be a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM) or a flash memory. The volatile memory may be a random access memory (RAM), a static random access memory (SRAM), a dynamic random access memory (DRAM), a synchronous dynamic random access memory (SDRAM), a double data rate synchronous dynamic random access memory (DDR SDRAM), an enhanced synchronous dynamic random access memory (ESDRAM), a synchlink dynamic random access memory (SLDRAM), and a direct rambus random access memory (DR RAM). The memory 1209 in this embodiment of this application includes, but is not limited to, such memories and any other suitable types of memories.
The processor 1210 may include one or more processing units. Optionally, the processor 1210 integrates an application processor and a modem processor. The application processor mainly processes operations related to an operating system, a user interface, an application program, and the like. The modem processor mainly processes wireless communication signals, and may be, for example, a baseband processor. It may be understood that the modem processor may alternatively not be integrated into the processor 1210.
The radio frequency unit 1201 is configured to perform sensing measurement on a sensing target to obtain a first measurement quantity result of a first measurement quantity of a dynamic reflection path of a first signal, where the first measurement quantity includes: at least one of a reflection path Doppler frequency or a reflection path length change speed, and the first signal is a signal sent by a second device to the first device; and send the first measurement quantity result, where the first measurement quantity result is used for determining a positioning sensing result of the sensing target.
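The two options for the first measurement quantity above are directly related: for a carrier at frequency f_c (wavelength λ = c / f_c), a Doppler shift f_D on the dynamic reflection path corresponds to a path length change speed of -λ·f_D. The following conversion uses the standard physical constants and sign convention (a positive Doppler shift means the path is shortening); it is an illustrative note, not a formula taken from this application:

```python
def path_length_change_speed(doppler_hz, carrier_hz):
    """Convert a measured dynamic-path Doppler frequency (Hz) into the
    corresponding reflection path length change speed (m/s) for a
    given carrier frequency. Standard convention: positive Doppler
    shift corresponds to a shortening path (negative change speed)."""
    c = 299_792_458.0  # speed of light in vacuum, m/s
    wavelength = c / carrier_hz
    return -wavelength * doppler_hz
```

For example, at a 3 GHz carrier (λ ≈ 0.1 m), a 100 Hz Doppler shift corresponds to the path shortening at roughly 10 m/s.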
Optionally, the first device includes:
The second device includes:
Optionally, the positioning sensing result includes at least one of:
Optionally, the first signal includes one of:
Optionally, the sending the first measurement quantity result includes:
Optionally, the sending the first measurement quantity result includes:
Optionally, the radio frequency unit 1201 is further configured to:
Optionally, the performing APS measurement on a wireless channel in which the sensing target is located, to obtain a second measurement quantity result of a second measurement quantity includes at least one of:
Optionally, the radio frequency unit 1201 is further configured to:
Optionally, the sending the second measurement quantity result includes:
Optionally, the second measurement quantity includes one of:
Optionally, the radio frequency unit 1201 is further configured to:
Optionally, the third device includes one of:
Optionally, the radio frequency unit 1201 is further configured to:
Optionally, the parameter configuration information includes at least one of:
Optionally, the radio frequency unit 1201 is further configured to:
Optionally, the device information includes at least one of:
Optionally, the status information includes at least one of:
Optionally, the reporting device information of the first device to a third device includes at least one of:
The communication device described above can improve positioning capabilities of the communication system.
An embodiment of this application further provides a communication device. The communication device is a third device, and includes a processor and a communication interface, where the communication interface is configured to receive first measurement quantity results sent by at least two first devices, where the first measurement quantity results are results of a first measurement quantity of a dynamic reflection path of a first signal that are obtained by the first devices by performing sensing measurement on a sensing target, the first measurement quantity includes: at least one of a reflection path Doppler frequency or a reflection path length change speed, and the first signal is a signal sent by a second device to the first device; and the processor or the communication interface is configured to determine a positioning sensing result of the sensing target based on the first measurement quantity results sent by the at least two first devices. The communication device embodiment corresponds to the foregoing third device method embodiment. Each implementation process and implementation of the foregoing method embodiment can be applied to this network side device embodiment, and can achieve the same technical effects.
Specifically, an embodiment of this application further provides a communication device, which is a third device. In this embodiment, an example in which the communication device is a core network device is used for description. As shown in
Specifically, the communication device 1300 in this embodiment of this application further includes: instructions or a program stored in the memory 1303 and executable on the processor 1301. The processor 1301 invokes the instructions or the program in the memory 1303 to perform the method executed by each module shown in
The network interface 1302 is configured to receive first measurement quantity results sent by at least two first devices, where the first measurement quantity results are results of a first measurement quantity of a dynamic reflection path of a first signal that are obtained by the first devices by performing sensing measurement on a sensing target, the first measurement quantity includes: at least one of a reflection path Doppler frequency or a reflection path length change speed, and the first signal is a signal sent by a second device to the first device; and
Optionally, the first device includes:
The second device includes:
Optionally, the positioning sensing result includes at least one of:
Optionally, the first signal includes one of:
Optionally, the receiving first measurement quantity results sent by at least two first devices includes:
The determining a positioning sensing result of the sensing target based on the first measurement quantity results sent by the at least two first devices includes:
Optionally, the receiving first measurement quantity results sent by at least two first devices includes:
Optionally, the network interface 1302 is further configured to:
Optionally, the initial location of the sensing target includes:
Optionally, the initial location of the sensing target determined based on the device-free technology includes:
Optionally, the initial location of the sensing target determined based on the angle information of the sensing target is determined in the following manner:
Optionally, the calculating, by the third device, a confidence of each candidate location in a plurality of candidate locations in an initial location search area includes:
Optionally, the target information further includes:
Optionally, the weight corresponding to each first device is determined by the third device based on device information of the first device, where
Optionally, the status information includes at least one of:
Optionally, the second measurement quantity is a suppressed second measurement quantity obtained by the first device by suppressing interference energy other than a dynamic reflection path spectrum peak in the second measurement quantity.
Optionally, the receiving second measurement quantity results sent by at least two first devices includes:
The determining a current location of the sensing target according to the initial location, the first measurement quantity results, and the second measurement quantity results includes:
Optionally, the second measurement quantity includes at least one of:
Optionally, the network interface 1302 is further configured to:
Optionally, the network interface 1302 is further configured to:
Optionally, the parameter configuration information includes at least one of:
Optionally, the network interface 1302 is further configured to:
Optionally, the device information includes at least one of:
Optionally, the status information includes at least one of:
Optionally, the determining devices participating in collaborative sensing in the plurality of devices according to the device information sent by the plurality of devices includes:
Optionally, the processor 1301 is further configured to:
Optionally, the allocating devices participating in collaborative sensing to the sensing target from among the determined devices participating in collaborative sensing includes:
Optionally, the sensing sub-area includes: a network side device sensing sub-area and a terminal sensing sub-area, at least one network side device is allocated to one network side device sensing sub-area, at least one terminal is allocated to one terminal sensing sub-area, and one network side device sensing sub-area covers at least one terminal sensing sub-area; and
Optionally, in a case that a same terminal exists in terminals allocated to two terminal sensing sub-areas, the same terminal participates in collaborative sensing in the two terminal sensing sub-areas in a time division multiplexing manner, or a frequency division multiplexing manner, or a code division multiplexing manner.
Optionally, the processor 1301 is further configured to:
Optionally, the updating devices participating in collaborative sensing for the sensing target includes at least one of:
Optionally, the first condition includes at least one of:
Optionally, trajectory sensing of the sensing target is initiated by the third device based on a sensing requirement; and/or
The communication device described above can improve positioning capabilities of the communication system.
An embodiment of this application further provides a readable storage medium. The readable storage medium stores a program or instructions, where the program or the instructions, when executed by a processor, implement the processes of embodiments of the foregoing target positioning sensing method, and achieve the same technical effects. To avoid repetition, details are not described herein again.
The processor is the processor in the terminal described in the foregoing embodiment. The readable storage medium includes a computer-readable storage medium such as a computer read only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
An embodiment of this application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or instructions, to implement the processes of embodiments of the foregoing target positioning sensing method, and achieve the same technical effects. To avoid repetition, details are not described herein again.
It should be understood that, the chip described in this embodiment of this application may also be referred to as a system-level chip, a system chip, a chip system, a system on chip, or the like.
An embodiment of this application further provides a computer program product, stored in a storage medium, where the computer program product is executed by at least one processor to implement the processes of embodiments of the target positioning sensing method, and can achieve the same technical effects. To avoid repetition, details are not described herein again.
An embodiment of this application further provides a target positioning sensing system, including: a first device, a second device, and a third device. The first device may be configured to perform the steps of the target positioning sensing method on the first device side as described above, the second device may be configured to send the first signal to the first device, and the third device may be configured to perform the steps of the target positioning sensing method on the third device side as described above.
It should be noted that the term "include", "comprise", or any other variation thereof in this specification is intended to cover a non-exclusive inclusion, which specifies the presence of stated processes, methods, objects, or apparatuses, but does not preclude the presence or addition of one or more other processes, methods, objects, or apparatuses. Without further limitation, an element defined by the phrase "including a" does not exclude the presence of other identical elements in the processes, methods, objects, or apparatuses that include the element. Further, it should be noted that the scope of the methods and apparatuses in the implementations of this application is not limited to performing the functions in the order shown or discussed, but may further include performing the functions in a substantially simultaneous manner or in the reverse order depending on the functions involved. For example, the described method may be performed in an order different from that described, and various steps may further be added, omitted, or combined. In addition, features described with reference to some examples may be combined in other examples.
Through the descriptions of the foregoing implementations, a person skilled in the art may clearly understand that the methods in the foregoing embodiments may be implemented via software together with a necessary general-purpose hardware platform, and certainly may also be implemented by hardware alone, but in many cases the former is the better implementation. Based on such an understanding, the technical solutions of this application essentially, or the part contributing to the prior art, may be implemented in the form of a computer software product. The computer software product is stored in a storage medium (for example, a ROM/RAM, a magnetic disk, or an optical disc) and includes several instructions for instructing a terminal (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to perform the methods described in embodiments of this application.
Embodiments of this application are described above with reference to the accompanying drawings. However, this application is not limited to the foregoing specific implementations. The foregoing specific implementations are illustrative rather than limitative. Enlightened by this application, a person of ordinary skill in the art may derive many variations without departing from the spirit of this application and the scope of protection of the claims. All such variations fall within the protection of this application.
Number | Date | Country | Kind |
---|---|---|---|
202111600044.4 | Dec 2021 | CN | national |
This application is a continuation of International Application No. PCT/CN2022/140653 filed on Dec. 21, 2022, which claims priority to Chinese Patent Application No. 202111600044.4 filed on Dec. 24, 2021, which are incorporated herein by reference in their entireties.
Relation | Number | Date | Country
---|---|---|---
Parent | PCT/CN2022/140653 | Dec 2022 | WO
Child | 18752275 | | US