The embodiments relate to the field of video processing, and in particular, to a method and an apparatus for assessing quality of a VR video.
A virtual reality (VR) technology is a cutting-edge technology that combines a plurality of fields, including computer graphics, man-machine interaction, sensors, man-machine interfaces, artificial intelligence, and the like. In the VR technology, appropriate equipment is used to deceive human senses (for example, senses of three-dimensional vision, hearing, and smell), so that a person can create, experience, and interact with a world detached from reality. Briefly, the VR technology is a technology in which a computer is used to create a virtual world and provide immersive and interactive audio-visual experience. With increasing popularity of VR services, a VR industry ecosystem is emerging. Operators, industry partners, and ordinary consumers all need a VR service quality assessment method to evaluate user experience. User experience is evaluated mainly by assessing quality of a VR video, to drive transformation of the VR service from merely available to user-friendly and to facilitate development of the VR industry.
In the conventional technology, quality of a video is assessed by using a bit rate, resolution, and a frame rate of the video. This is a method for assessing quality of a conventional video. However, a VR video differs greatly from a conventional video: the VR video is a 360-degree panoramic video, and the VR video is encoded in a unique manner. If the quality of the VR video is assessed by using the method for assessing quality of the conventional video, the assessment result has low accuracy.
Embodiments provide a method and an apparatus for assessing quality of a VR video. In the embodiments, accuracy of an assessment result of a VR video is improved.
According to a first aspect, an embodiment provides a method for assessing quality of a VR video, including:
obtaining a bit rate, a frame rate, resolution, and temporal perceptual information (TI) of a VR video, where the TI of the VR video is used to represent a time variation of a video sequence of the VR video; and determining a mean opinion score (MOS) of the VR video based on the bit rate, the frame rate, the resolution, and the TI of the VR video, where the MOS of the VR video is used to represent quality of the VR video. In comparison with the conventional technology, the TI is introduced as a parameter for assessing the quality of the VR video, and therefore accuracy of a quality assessment result of the VR video is improved.
In another embodiment, the obtaining TI of a VR video includes:
obtaining a difference between pixel values at a same location in two adjacent frames of images of the VR video; and calculating the difference between the pixel values at the same location in the two adjacent frames of images based on standard deviation formulas, to obtain the TI of the VR video.
The standard deviation formulas are

μ = (1/(W*H)) * Σ(i=1..H) Σ(j=1..W) Pij, and

TI = sqrt((1/(W*H)) * Σ(i=1..H) Σ(j=1..W) (Pij − μ)^2),

where
Pij represents a difference between a pixel value of a jth pixel in an ith row of a current frame in the two adjacent frames of images and a pixel value of a jth pixel in an ith row of a previous frame of the current frame, and
W and H respectively represent a width and a height of each of the two adjacent frames of images.
In another embodiment, the obtaining TI of a VR video includes:
obtaining a head rotation angle Δa of a user within preset duration Δt; determining an average head rotation angle of the user based on the preset duration Δt and the head rotation angle Δa of the user; and determining the TI of the VR video based on the average head rotation angle of the user. A larger average head rotation angle of the user indicates a larger TI value of the VR video.
In another embodiment, the obtaining a head rotation angle Δa of a user within preset duration Δt includes:
obtaining a head angle γt of the user at a time point t and a head angle γt+Δt of the user at a time point t+Δt; and determining the head rotation angle Δa of the user according to the following method: When an absolute value of a difference between γt+Δt and γt is greater than 180 degrees, and γt is less than γt+Δt,
Δa=180−abs(γt)+180−abs(γt+Δt);
when the absolute value of the difference between γt+Δt and γt is greater than 180 degrees, and γt is greater than γt+Δt,
Δa=(180−abs(γt)+180−abs(γt+Δt))*(−1); and
when the absolute value of the difference between γt+Δt and γt is not greater than 180 degrees, Δa=γt+Δt−γt.
In another embodiment, the determining the TI of the VR video based on the average head rotation angle of the user includes:
inputting the average head rotation angle of the user into a first TI prediction model for calculation, to obtain the TI of the VR video. The first TI prediction model is TI=log(m*angleVelocity)+n, where angleVelocity represents the average head rotation angle of the user, and m and n are constants. The TI of the VR video is predicted based on the head rotation angle of the user, so that the computing power otherwise required to calculate the TI from pixel data can be saved.
In another embodiment, the determining the TI of the VR video based on the average head rotation angle of the user includes:
inputting the average head rotation angle of the user into a second TI prediction model for calculation, to obtain the TI of the VR video. The second TI prediction model is a nonparametric model. The TI of the VR video is predicted based on the head rotation angle of the user, so that the computing power otherwise required to calculate the TI from pixel data can be saved.
In another embodiment, the determining a mean opinion score MOS of the VR video based on the bit rate, the frame rate, the resolution, and the TI of the VR video includes:
inputting the bit rate, the resolution, the frame rate, and the TI of the VR video into a quality assessment model for calculation, to obtain the MOS of the VR video. The quality assessment model is as follows:
MOS=5−a*log(max(log(B1),0.01))−b*log(max(log(B2),0.01))−c*log(max(log F,0.01))−d*log(max(log(TI),0.01)),
where B1 represents the bit rate of the VR video, B2 represents the resolution of the VR video, F represents the frame rate of the VR video, and a, b, c, and d are constants.
According to a second aspect, an embodiment provides an assessment apparatus, including:
an obtaining unit, configured to obtain a bit rate, a frame rate, resolution, and temporal perceptual information TI of a VR video, where the TI of the VR video is used to represent a time variation of a video sequence of the VR video; and
a determining unit, configured to determine a mean opinion score MOS of the VR video based on the bit rate, the frame rate, the resolution, and the TI of the VR video, where the MOS of the VR video is used to represent quality of the VR video.
In another embodiment, when obtaining the TI of the VR video, the obtaining unit is configured to:
obtain a difference between pixel values at a same location in two adjacent frames of images of the VR video; and calculate the difference between the pixel values at the same location in the two adjacent frames of images based on standard deviation formulas, to obtain the TI of the VR video.
The standard deviation formulas are

μ = (1/(W*H)) * Σ(i=1..H) Σ(j=1..W) Pij, and

TI = sqrt((1/(W*H)) * Σ(i=1..H) Σ(j=1..W) (Pij − μ)^2),

where
Pij represents a difference between a pixel value of a jth pixel in an ith row of a current frame in the two adjacent frames of images and a pixel value of a jth pixel in an ith row of a previous frame of the current frame, and
W and H respectively represent a width and a height of each of the two adjacent frames of images.
In another embodiment, when obtaining the TI of the VR video, the obtaining unit is configured to:
obtain a head rotation angle Δa of a user within preset duration Δt; determine an average head rotation angle of the user based on the preset duration Δt and the head rotation angle Δa of the user; and determine the TI of the VR video based on the average head rotation angle of the user. A larger average head rotation angle of the user indicates a larger TI value of the VR video.
In another embodiment, when obtaining the head rotation angle Δa of the user within the preset duration Δt, the obtaining unit is configured to:
obtain a head angle γt of the user at a time point t and a head angle γt+Δt of the user at a time point t+Δt; and determine the head rotation angle Δa of the user according to the following method: When an absolute value of a difference between γt+Δt and γt is greater than 180 degrees, and γt is less than γt+Δt,
Δa=180−abs(γt)+180−abs(γt+Δt);
when the absolute value of the difference between γt+Δt and γt is greater than 180 degrees, and γt is greater than γt+Δt,
Δa=(180−abs(γt)+180−abs(γt+Δt))*(−1); and
when the absolute value of the difference between γt+Δt and γt is not greater than 180 degrees, Δa=γt+Δt−γt.
In another embodiment, when determining the TI of the VR video based on the average head rotation angle of the user, the obtaining unit is configured to:
input the average head rotation angle of the user into a first TI prediction model for calculation, to obtain the TI of the VR video. The first TI prediction model is TI=log(m*angleVelocity)+n, where angleVelocity represents the average head rotation angle of the user, and m and n are constants.
In another embodiment, when determining the TI of the VR video based on the average head rotation angle of the user, the obtaining unit is configured to:
input the average head rotation angle of the user into a second TI prediction model for calculation, to obtain the TI of the VR video. The second TI prediction model is a nonparametric model.
In another embodiment, when determining the mean opinion score MOS of the VR video based on the bit rate, the frame rate, the resolution, and the TI of the VR video, the determining unit is configured to:
input the bit rate, the resolution, the frame rate, and the TI of the VR video into a quality assessment model for calculation, to obtain the MOS of the VR video. The quality assessment model is as follows:
MOS=5−a*log(max(log(B1),0.01))−b*log(max(log(B2),0.01))−c*log(max(log F,0.01))−d*log(max(log(TI),0.01)), where
B1 represents the bit rate of the VR video, B2 represents the resolution of the VR video, F represents the frame rate of the VR video, and a, b, c, and d are constants.
According to a third aspect, an embodiment provides an assessment apparatus, including:
a memory that stores executable program code; and a processor coupled to the memory. The processor invokes the executable program code stored in the memory to perform some or all of the steps in the method according to the first aspect.
According to a fourth aspect, an embodiment provides a computer-readable storage medium. The computer storage medium stores a computer program, the computer program includes program instructions, and when the program instructions are executed by a processor, the processor is enabled to perform some or all of the steps in the method according to the first aspect.
It may be understood that in the solutions of the embodiments, the bit rate, the frame rate, the resolution, and the TI of the VR video are obtained, and the mean opinion score MOS of the VR video is determined based on the bit rate, the frame rate, the resolution, and the TI of the VR video, where the MOS of the VR video is used to represent quality of the VR video. In the embodiments, accuracy of an assessment result of the VR video can be improved.
These aspects or other aspects are clearer and more comprehensible in description of the following embodiments.
To describe the solutions in the embodiments or in the conventional technology more clearly, the following briefly describes the accompanying drawings for describing the embodiments or the conventional technology. It is clear that the accompanying drawings in the following description show merely some embodiments, and a person of ordinary skill in the art may derive other drawings from these accompanying drawings without creative efforts.
FIG. 1 shows an example application scenario that includes a video server 101, an intermediate network device 102, and a terminal device 103. The video server 101 is a server that provides a video service, for example, a server of an operator.
The intermediate network device 102 is a device for implementing video transmission between the video server 101 and the terminal device 103, for example, a home gateway. The home gateway not only functions as a hub connecting the home network to the outside, but also serves as the most important control center in the entire home network. On a network side, the home gateway provides a high-speed access interface for accessing a wide area network. On a user side, the home gateway provides an Ethernet interface and/or a wireless local area network function for connecting various service terminals in a home, for example, a personal computer and an IP set-top box.
The terminal device 103 is also referred to as user equipment (UE), and is a device that provides voice and/or data connectivity for a user, for example, a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a mobile internet device (MID), or a wearable device such as a head-mounted device.
Any one of the video server 101, the intermediate network device 102, and the terminal device 103 may perform the method for assessing quality of a VR video provided in the embodiments.
FIG. 2 shows a method for assessing quality of a VR video according to an embodiment. The method includes the following steps.

S201. An assessment apparatus obtains a bit rate, resolution, a frame rate, and TI of a VR video.
The bit rate of the VR video is a rate at which a bitstream of the VR video is transmitted per unit of time, the resolution of the VR video is resolution of each frame of image of the VR video, and the frame rate of the VR video is a quantity of frames of refreshed images per unit of time. The TI of the VR video is used to indicate a time variation of a video sequence of the VR video. A larger time variation of a video sequence indicates a larger TI value of the video sequence. A video sequence with a relatively high degree of motion usually has a relatively large time variation, and therefore the video sequence usually has a relatively large TI value.
In another embodiment, the assessment apparatus calculates the bit rate of the VR video by obtaining a payload of the bitstream of the VR video in a period of time. The assessment apparatus parses the bitstream of the VR video to obtain a sequence parameter set (SPS) and a picture parameter set (PPS) of the VR video, and then determines the resolution and the frame rate of the VR video based on syntax elements in the SPS and the PPS.
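As an illustration of the bit-rate part of this step, the following minimal sketch computes an average bit rate from the payload observed over a time window; the function name and inputs are hypothetical, and extracting the resolution and frame rate from SPS/PPS syntax elements is left to a separate bitstream parser.

```python
# A minimal sketch, assuming the total payload of the VR bitstream observed
# over a time window is already known; names here are illustrative only.
def bit_rate_kbps(payload_bytes: int, window_seconds: float) -> float:
    """Average bit rate over the observation window, in kbit/s."""
    return payload_bytes * 8 / window_seconds / 1000.0

# Example: 30 MB of bitstream observed over 10 seconds -> 24000.0 kbit/s.
print(bit_rate_kbps(30_000_000, 10.0))
```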
In another embodiment, that an assessment apparatus obtains TI of a VR video includes:
determining the TI of the VR video in a manner specified in ITU-R BT.1788, that is, determining the TI of the VR video based on pixel values of two adjacent frames of images of the VR video; or
determining the TI of the VR video based on head rotation angle information of a user.
For example, that the assessment apparatus determines the TI of the VR video based on pixel values of two adjacent frames of images of the VR video includes:
The assessment apparatus obtains a difference between pixel values of pixels at a same location in the two adjacent frames of images; and calculates the difference between the pixel values of the pixels at the same location in the two adjacent frames of images based on standard deviation formulas, to obtain the TI of the VR video.
The standard deviation formulas are

μ = (1/(W*H)) * Σ(i=1..H) Σ(j=1..W) Pij, and

TI = sqrt((1/(W*H)) * Σ(i=1..H) Σ(j=1..W) (Pij − μ)^2),

where
Pij represents a difference between a pixel value of a jth pixel in an ith row of a current frame in the two adjacent frames of images and a pixel value of a jth pixel in an ith row of a previous frame of the current frame, and
W and H respectively represent a width and a height of each of the two adjacent frames of images. In other words, W*H is resolution of each of the two adjacent frames of images.
In an example, if the assessment apparatus determines the TI of the VR video based on pixel values of pixels in N consecutive frames of images of the VR video, the assessment apparatus obtains N−1 candidate TI values based on the foregoing description of determining the TI of the VR video based on pixel values of two adjacent frames of images, and then determines an average value of the N−1 candidate TI values as the TI of the VR video, where N is an integer greater than 2.
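The following is a minimal sketch of this TI calculation, assuming the frames are available as H×W luma arrays; the helper names are illustrative rather than part of the embodiment.

```python
import numpy as np

def candidate_ti(prev_frame: np.ndarray, cur_frame: np.ndarray) -> float:
    """Spatial standard deviation of the inter-frame difference P_ij."""
    p = cur_frame.astype(np.float64) - prev_frame.astype(np.float64)  # P_ij
    mu = p.mean()  # (1/(W*H)) * sum over i, j of P_ij
    return float(np.sqrt(((p - mu) ** 2).mean()))

def sequence_ti(frames: list) -> float:
    """Average of the N-1 candidate TI values over N consecutive frames."""
    candidates = [candidate_ti(a, b) for a, b in zip(frames, frames[1:])]
    return sum(candidates) / len(candidates)
```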
In another embodiment, that the assessment apparatus determines the TI of the VR video based on head rotation angle information of a user includes:
obtaining a head rotation angle Δa of the user within preset duration Δt;
determining an average head rotation angle of the user based on the preset duration Δt and the head rotation angle Δa of the user; and
determining the TI of the VR video based on the average head rotation angle of the user.
For example, that the assessment apparatus obtains a head rotation angle Δa of the user within preset duration Δt includes:
obtaining a head angle γt of the user at a time point t and a head angle γt+Δt of the user at a time point t+Δt and determining the head rotation angle Δa of the user according to the following method:
When an absolute value of a difference between γt+Δt and γt is greater than 180 degrees, and γt is less than γt+Δt,
Δa=180−abs(γt)+180−abs(γt+Δt);
when the absolute value of the difference between γt+Δt and γt is greater than 180 degrees, and γt is greater than γt+Δt,
Δa=(180−abs(γt)+180−abs(γt+Δt))*(−1); and
when the absolute value of the difference between γt+Δt and γt is not greater than 180 degrees, Δa=γt+Δt−γt.
The assessment apparatus then determines the average head rotation angle angleVelocity of the user based on the preset duration Δt and the head rotation angle Δa of the user, where angleVelocity=Δa/Δt.
It should be noted that the preset duration may be duration of playing a frame of image of the VR video.
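A minimal sketch of these wrap-around rules and of angleVelocity = Δa/Δt follows; the function names are illustrative, and the multiply-by-(−1) branch follows the sign convention of the rules above.

```python
def head_rotation_angle(gamma_t: float, gamma_t_dt: float) -> float:
    """Head rotation angle delta_a between time points t and t + delta_t.

    Angles are in degrees in [-180, 180]; the branches handle rotation
    across the +/-180 degree boundary per the rules above.
    """
    if abs(gamma_t_dt - gamma_t) > 180:
        span = 180 - abs(gamma_t) + 180 - abs(gamma_t_dt)
        # When gamma_t > gamma_t_dt, the boundary was crossed in the
        # negative direction, so the span is multiplied by -1.
        return span if gamma_t < gamma_t_dt else -span
    return gamma_t_dt - gamma_t

def angle_velocity(gamma_t: float, gamma_t_dt: float, dt: float) -> float:
    """Average head rotation angle: angleVelocity = delta_a / delta_t."""
    return head_rotation_angle(gamma_t, gamma_t_dt) / dt

# Example: -170 to 170 degrees within one 25 fps frame (0.04 s) -> 500.0.
print(angle_velocity(-170.0, 170.0, 0.04))
```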
In a possible embodiment, that the assessment apparatus determines the TI of the VR video based on the average head rotation angle of the user includes:
The assessment apparatus inputs angleVelocity into a first TI prediction model for calculation, to obtain the TI of the VR video.
It should be noted that a larger value of angleVelocity indicates a larger TI value of the VR video.
Optionally, the first TI prediction model is TI=log(m*angleVelocity)+n, where m and n are constants.
Optionally, m and n may be empirically set, and value ranges of m and n may be [−100, 100]. Further, the value ranges of m and n may be [−50, 50].
Optionally, m and n may alternatively be obtained through training, and m and n obtained through training are usually values in a range [−100, 100]. A process of obtaining m and n through training is a process of obtaining the TI prediction model through training.
In another embodiment, before angleVelocity is input into the first TI prediction model for calculation, the assessment apparatus obtains a first training data set that includes a plurality of data items, and trains a first parametric model by using the first training data set, to obtain the first TI prediction model. Each data item in the first training data set includes an average head rotation angle and TI. The average head rotation angle is input data of the first parametric model, and the TI is output data of the first parametric model.
It should be noted that the first parametric model is a model described by using an algebraic equation, a differential equation, a differential equation system, a transfer function, and the like. Establishing the first parametric model is determining parameters in a known model structure, for example, m and n in the TI prediction model.
In an example, the assessment apparatus may train a parametric model by using a training data set, to obtain parameters in the model, for example, m and n in the first TI prediction model.
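For illustration, the following sketch fits m and n with a least-squares curve fit, assuming the model's log is the natural logarithm; the training pairs are made up, and because log(m*x)+n equals log(x)+log(m)+n, m and n are not uniquely identifiable, so this should be read as a sketch only.

```python
import numpy as np
from scipy.optimize import curve_fit

def first_ti_model(angle_velocity, m, n):
    """First TI prediction model: TI = log(m * angleVelocity) + n."""
    return np.log(m * angle_velocity) + n

# Hypothetical first training data set: average head rotation angle -> TI.
x_train = np.array([5.0, 10.0, 20.0, 40.0, 80.0])
y_train = np.array([10.0, 12.0, 14.5, 16.0, 18.5])

(m, n), _ = curve_fit(first_ti_model, x_train, y_train, p0=(1.0, 1.0))
print(first_ti_model(30.0, m, n))  # predicted TI for angleVelocity = 30
```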
In another embodiment, that the assessment apparatus determines the TI of the VR video based on the average head rotation angle of the user includes:
inputting angleVelocity into a second TI prediction model for calculation, to obtain the TI of the VR video. The second TI prediction model is a nonparametric model.
Herein, it should be noted that in a nonparametric model, no strong assumption is made about the form of the objective function. Because no assumption is made, the objective function can take any form that is learned freely from training data. A training step of the nonparametric model is similar to that of a parametric model, and a large quantity of training data needs to be prepared to train the model. However, for the nonparametric model, no assumption needs to be made about the form of the objective function, which is different from the case of the parametric model, in which a form of the objective function needs to be determined in advance. For example, a k-nearest neighbor (KNN) algorithm may be used.
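Purely as an illustration of such a nonparametric model, the sketch below uses a KNN regressor from scikit-learn; the library choice and the training values are assumptions, not part of the embodiment.

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

# Hypothetical training data: average head rotation angle -> measured TI.
x_train = np.array([[5.0], [10.0], [20.0], [40.0], [80.0]])
y_train = np.array([10.0, 12.0, 14.5, 16.0, 18.5])

# The regressor stores the samples; a prediction averages the TI values of
# the k training points whose head rotation angles are closest to the input.
model = KNeighborsRegressor(n_neighbors=3)
model.fit(x_train, y_train)
print(model.predict([[30.0]]))  # predicted TI for angleVelocity = 30
```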
It should be noted that the assessment apparatus in the embodiments is connected to a head-mounted display (HMD) of the user in a wired or wireless manner, so that the assessment apparatus can obtain the head angle information of the user.
S202. The assessment apparatus determines a MOS of the VR video based on the bit rate, the resolution, the frame rate, and the TI of the VR video.
The MOS of the VR video is used to represent quality of the VR video and is an evaluation criterion for measuring video quality. The scoring criterion comes from ITU-T P.910: video quality is classified into five levels: excellent, good, fair, poor, and very poor, and the corresponding MOSs are 5, 4, 3, 2, and 1 respectively.
For example, the assessment apparatus inputs the bit rate, the resolution, the frame rate, and the TI of the VR video into a quality assessment model for calculation, to obtain the MOS of the VR video.
Optionally, the quality assessment model may be as follows:
MOS=5−a*log(max(log(B1),0.01))−b*log(max(log(B2),0.01))−c*log(max(log F,0.01))−d*log(max(log(TI),0.01)), where
B1 represents the bit rate of the VR video, B2 represents the resolution of the VR video, F represents the frame rate of the VR video, and a, b, c, and d are constants.
Optionally, a, b, c, and d may be empirically set, and value ranges of a, b, c, and d may be [−100, 100]. Further, the value ranges of a, b, c, and d may be [−50, 50].
Optionally, a, b, c, and d may alternatively be obtained through training, and a, b, c, and d obtained through training are usually values in a range [−100, 100]. A process of obtaining a, b, c, and d through training is a process of obtaining the quality assessment model through training.
It should be noted that a higher bit rate of the VR video indicates a larger MOS value of the VR video, that is, indicates higher quality of the VR video. Higher resolution of the VR video indicates a larger MOS value of the VR video. A higher frame rate of the VR video indicates a larger MOS value of the VR video. A larger TI value of the VR video indicates a larger MOS value of the VR video.
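A minimal sketch of evaluating the quality assessment model, assuming the natural logarithm; the constants passed in the example are placeholders, not values from the embodiment.

```python
import math

def vr_mos(bit_rate, resolution, frame_rate, ti, a, b, c, d):
    """MOS = 5 - a*log(max(log(B1),0.01)) - b*log(max(log(B2),0.01))
             - c*log(max(log(F),0.01)) - d*log(max(log(TI),0.01))."""
    def term(x, k):
        # max(..., 0.01) keeps the outer log defined when the inner log
        # is small; both logs are assumed to be natural logarithms.
        return k * math.log(max(math.log(x), 0.01))
    return (5 - term(bit_rate, a) - term(resolution, b)
            - term(frame_rate, c) - term(ti, d))

# Placeholder constants; in practice a, b, c, and d are set empirically
# or obtained through training, as described below.
print(vr_mos(8000, 3840 * 1920, 30, 15, a=0.3, b=0.3, c=0.2, d=0.1))
```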
In another embodiment, before the bit rate, the resolution, the frame rate, and the TI of the VR video are input into the quality assessment model for calculation, the assessment apparatus obtains a third training data set that includes a plurality of data items, and trains a second parametric model by using the third training data set, to obtain the quality assessment model. Each data item in the third training data set includes information about a VR video and a MOS. The information about the VR video is input data of the second parametric model, and the MOS is output data of the second parametric model. The information about the VR video includes a bit rate, resolution, a frame rate, and TI of the VR video.
It should be noted that the second parametric model is a model described by using an algebraic equation, a differential equation, a differential equation system, a transfer function, and the like. Establishing the second parametric model is determining parameters in a known model structure, for example, a, b, c, and d in the quality assessment model.
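Because the model is linear in a, b, c, and d once the nested logarithms are computed, ordinary least squares is one possible way to train the second parametric model; the sketch below uses made-up training rows.

```python
import numpy as np

def features(bit_rate, resolution, frame_rate, ti):
    """Feature vector [log(max(log(x), 0.01))] for the four model inputs."""
    return [np.log(max(np.log(v), 0.01))
            for v in (bit_rate, resolution, frame_rate, ti)]

# Hypothetical third training data set: video information -> subjective MOS.
rows = [(2000, 1920 * 960, 25, 8), (6000, 2560 * 1280, 30, 12),
        (12000, 3840 * 1920, 30, 15), (25000, 3840 * 1920, 60, 18),
        (40000, 7680 * 3840, 60, 22)]
mos_scores = np.array([2.8, 3.4, 3.9, 4.3, 4.6])

X = np.array([features(*r) for r in rows])
# MOS = 5 - X @ [a, b, c, d], so solve X @ coef = 5 - MOS for coef.
coef, *_ = np.linalg.lstsq(X, 5 - mos_scores, rcond=None)
a, b, c, d = coef
```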
It may be understood that in the solution of this embodiment, the assessment apparatus introduces the TI of the VR video to assess the quality of the VR video. In comparison with the conventional technology, accuracy of quality assessment of the VR video is significantly improved.
FIG. 3 shows an assessment apparatus 300 according to an embodiment. The assessment apparatus 300 includes:
an obtaining unit 301, configured to obtain a bit rate, a frame rate, resolution, and temporal perceptual information TI of a VR video, where the TI of the VR video is used to represent a time variation of a video sequence of the VR video; and
a determining unit 302, configured to determine a mean opinion score MOS of the VR video based on the bit rate, the frame rate, the resolution, and the TI of the VR video, where the MOS of the VR video is used to represent quality of the VR video.
In another embodiment, when obtaining the TI of the VR video, the obtaining unit 301 is configured to:
obtain a difference between pixel values at a same location in two adjacent frames of images of the VR video; and calculate the difference between the pixel values at the same location in the two adjacent frames of images based on standard deviation formulas, to obtain the TI of the VR video.
The standard deviation formulas are

μ = (1/(W*H)) * Σ(i=1..H) Σ(j=1..W) Pij, and

TI = sqrt((1/(W*H)) * Σ(i=1..H) Σ(j=1..W) (Pij − μ)^2),

where
Pij represents a difference between a pixel value of a jth pixel in an ith row of a current frame in the two adjacent frames of images and a pixel value of a jth pixel in an ith row of a previous frame of the current frame, and
W and H respectively represent a width and a height of each of the two adjacent frames of images.
In another embodiment, when obtaining the TI of the VR video, the obtaining unit 301 is configured to:
obtain a head rotation angle Δa of a user within preset duration Δt; determine an average head rotation angle of the user based on the preset duration Δt and the head rotation angle Δa of the user; and determine the TI of the VR video based on the average head rotation angle of the user. A larger average head rotation angle of the user indicates a larger TI value of the VR video.
In another embodiment, when obtaining the head rotation angle Δa of the user within the preset duration Δt, the obtaining unit 301 is configured to:
obtain a head angle γt of the user at a time point t and a head angle γt+Δt of the user at a time point t+Δt; and determine the head rotation angle Δa of the user according to the following method: When an absolute value of a difference between γt+Δt and γt is greater than 180 degrees, and γt is less than γt+Δt,
Δa=180−abs(γt)+180−abs(γt+Δt);
when the absolute value of the difference between γt+Δt and γt is greater than 180 degrees, and γt is greater than γt+Δt,
Δa=(180−abs(γt)+180−abs(γt+Δt))*(−1); and
when the absolute value of the difference between γt+Δt and γt is not greater than 180 degrees, Δa=γt+Δt−γt.
In another embodiment, when determining the TI of the VR video based on the average head rotation angle of the user, the obtaining unit 301 is configured to:
input the average head rotation angle of the user into a first TI prediction model for calculation, to obtain the TI of the VR video. The first TI prediction model is TI=log(m*angleVelocity)+n, where angleVelocity represents the average head rotation angle of the user, and m and n are constants.
In another embodiment, when determining the TI of the VR video based on the average head rotation angle of the user, the obtaining unit 301 is configured to:
input the average head rotation angle of the user into a second TI prediction model for calculation, to obtain the TI of the VR video. The second TI prediction model is a nonparametric model.
In another embodiment, when determining the mean opinion score MOS of the VR video based on the bit rate, the frame rate, the resolution, and the TI of the VR video, the determining unit 302 is configured to:
input the bit rate, the resolution, the frame rate, and the TI of the VR video into a quality assessment model for calculation, to obtain the MOS of the VR video. The quality assessment model is as follows:
MOS=5−a*log(max(log(B1),0.01))−b*log(max(log(B2),0.01))−c*log(max(log F,0.01))−d*log(max(log(TI),0.01)), where
B1 represents the bit rate of the VR video, B2 represents the resolution of the VR video, F represents the frame rate of the VR video, and a, b, c, and d are constants.
It should be noted that the units (the obtaining unit 301 and the determining unit 302) are configured to perform related steps of the foregoing method. The obtaining unit 301 is configured to perform related content of step S201, and the determining unit 302 is configured to perform related content of step S202.
In this embodiment, the assessment apparatus 300 is presented in a form of a unit. The "unit" herein may be an application-specific integrated circuit (ASIC), a processor or a memory that executes one or more software or firmware programs, an integrated logic circuit, and/or another device that can provide the foregoing function. In addition, the obtaining unit 301 and the determining unit 302 may be implemented by using a processor 401 of an assessment apparatus shown in FIG. 4.
As shown in FIG. 4, the assessment apparatus includes a processor 401, a memory 402, and a communications interface 403 that are connected, for example, by using a bus.
The processor 401 may be a general-purpose central processing unit (CPU), a microprocessor, an ASIC, or one or more integrated circuits for controlling program execution of the foregoing solutions.
The communications interface 403 is configured to communicate with another device or a communications network, for example, an Ethernet, a radio access network (RAN), or a wireless local area network (WLAN).
The memory 402 may be a read-only memory (ROM) or another type of static storage device that can store static information and an instruction, or a random access memory (RAM) or another type of dynamic storage device that can store information and an instruction, or may be an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other compact disc storage, optical disc storage (including a compressed optical disc, a laser disc, an optical disc, a digital versatile disc, a Blu-ray optical disc, and the like), a magnetic disk storage medium or another magnetic storage device, or any other medium that can be used to carry or store expected program code in a form of an instruction or a data structure and that can be accessed by a computer. However, this is not limited thereto. The memory may exist independently and is connected to the processor by using the bus. The memory may be alternatively integrated with the processor.
The memory 402 is configured to store application program code for executing the foregoing solutions, and the processor 401 controls the execution. The processor 401 is configured to execute the application program code stored in the memory 402.
The code stored in the memory 402 may be used to perform related content of the method that is for assessing quality of a VR video and that is disclosed in the embodiment shown in FIG. 2.
The embodiments further provide a computer storage medium. The computer storage medium may store a program, and when the program is executed, some or all of the steps of any method for assessing quality of a VR video recorded in the foregoing method embodiments may be performed.
It should be noted that, to make the description brief, the foregoing method embodiments are expressed as a series of actions. However, a person of ordinary skill in the art should appreciate that the embodiments are not limited to the described action sequence, because according to the embodiments, some steps may be performed in other sequences or performed simultaneously. In addition, a person of ordinary skill in the art should also appreciate that all the embodiments are example embodiments, and the related actions and modules are not necessarily mandatory to all or other embodiments.
In the foregoing embodiments, the description of each embodiment has respective focuses. For a part that is not described in detail in an embodiment, refer to related descriptions in other embodiments.
In the several embodiments provided, it should be understood that the disclosed apparatus may be implemented in another manner. For example, the described apparatus embodiment is merely an example. For example, the unit division is merely logical function division and may be another division during actual implementation. For example, a plurality of units or components may be combined or integrated into another system, or some features may be ignored or not performed. In addition, the displayed or discussed mutual couplings or direct couplings or communication connections may be implemented through some interfaces. The indirect couplings or communication connections between the apparatuses or units may be implemented in electronic or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one position, or may be distributed on a plurality of network units. Some or all of the units may be selected based on actual requirements to achieve the objectives of the solutions of the embodiments.
In addition, functional units in the embodiments may be integrated into one processing unit, or each of the units may exist alone physically, or two or more units are integrated into one unit. The integrated unit may be implemented in a form of hardware, or may be implemented in a form of a software functional unit.
When the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, the integrated unit may be stored in a computer-readable memory. Based on such an understanding, the solutions essentially, or the part contributing to the conventional technology, or all or some of the solutions may be implemented in the form of a software product. The software product is stored in a memory and includes several instructions for instructing a computer device (which may be a personal computer, a server, or a network device) to perform all or some of the steps of the methods described in the embodiments. The foregoing memory includes: any medium that can store program code, such as a USB flash drive, a ROM, a RAM, a removable hard disk, a magnetic disk, or an optical disc.
A person of ordinary skill in the art may understand that all or some of the steps of the methods in the embodiments may be implemented by a program instructing related hardware. The program may be stored in a computer-readable memory. The memory may include a flash memory, a ROM, a RAM, a magnetic disk, an optical disc, or the like.
The embodiments are described in detail above. The principle and implementation are described herein through specific examples. The description about the embodiments is merely provided to help understand the method and core ideas. In addition, a person of ordinary skill in the art can make variations and modifications to the embodiments in terms of the specific implementations and scopes according to the ideas. Therefore, the content of embodiments shall not be construed as limiting.
This application is a continuation of International Application No. PCT/CN2020/090724, filed on May 17, 2020, which claims priority to Chinese Patent Application No. 201910416533.0, filed on May 17, 2019. The disclosures of the aforementioned applications are hereby incorporated by reference in their entireties.