The present disclosure relates to a video quality estimation apparatus, a video quality estimation method, and a video quality estimation system.
In recent years, the demand for video distribution services has been increasing.
However, video traffic consumes a large amount of bandwidth. This makes reducing video traffic a critical issue for network operations.
Therefore, techniques to reduce video traffic have been proposed in recent years. For example, Patent Literature 1 discloses a technique for estimating the throughput of video downloads and then, based on the estimated throughput, selecting the bit rate at which the traffic volume becomes the lowest.
Other techniques to reduce video traffic include traffic shaping. Traffic shaping is a bandwidth control technique that distributes videos at a constant bit rate (shaping rate).
Patent Literature 1: Japanese Unexamined Patent Application Publication No. 2019-016961
Incidentally, when traffic shaping is performed, there has been a demand from network operators to know at what bit rate the shaping is performed and at what resolution the video is being played on a terminal. That is, network operators have a demand to acquire the relationship between a video resolution and a bit rate.
However, the resolution of the video being played on a terminal fluctuates due to fluctuations in network quality. This means that, in order to check the video resolution, the network operators have to actually watch the video on the terminal. The problem is that network operators need to collect a huge amount of data to acquire the relationship between the video resolution and the bit rate, which can incur enormous costs.
An object of the present disclosure is to provide a video quality estimation apparatus, a video quality estimation method, and a video quality estimation system that can solve the above problem and enable acquisition of a relationship between a video resolution and a bit rate without incurring enormous costs.
In an example aspect, a video quality estimation apparatus includes:
In another example aspect, a video quality estimation method includes:
In another example aspect, a video quality estimation system includes:
According to the above example aspects, it is possible to achieve an effect of providing a video quality estimation apparatus, a video quality estimation method, and a video quality estimation system that enable acquisition of a relationship between a video resolution and a bit rate without incurring enormous costs.
Prior to describing example embodiments of the present disclosure, a detailed description of the problem to be solved by the present disclosure and an overview of the operation of each example embodiment of the present disclosure will be provided.
First, details of the problem to be solved by the present disclosure will be described.
ABR (Adaptive Bit Rate) streaming methods over HTTP (Hypertext Transfer Protocol) are currently the most popular video distribution methods.
The ABR streaming methods, which are standardized by MPEG-DASH (Moving Picture Experts Group - Dynamic Adaptive Streaming over HTTP) and other standards, aim to deliver videos at the maximum quality that does not exceed the available bandwidth of a network.
More specifically, in the ABR streaming methods, as shown in
Here, in the ABR streaming methods, as described above, the resolution is adjusted to provide a video of stable quality within the available bandwidth of the network. Thus, the video quality can be controlled by traffic shaping.
As shown in
Here, when traffic shaping is performed, there is a demand from a network operator to know at what bit rate the shaping is performed and at what resolution the video is being played on the terminal 10. That is, the network operator has a demand to acquire the relationship between a video resolution and a bit rate.
If the network operator can acquire the relationship between a video resolution and a bit rate, he/she can, for example, determine a guideline for the shaping rate at which a video having a certain resolution can be provided and then provide a video having a high resolution according to that shaping rate. In addition, if the network operator can provide a video having a higher resolution, he/she can inform the user using the terminal 10 by saying, for example, “You can watch videos having a high resolution on this network!”.
However, the resolution of the video being played on the terminal 10 fluctuates due to fluctuations in network quality. This means that the network operator needs to actually watch the video to check its resolution. The problem is that network operators need to collect a huge amount of data to know the relationship between the video resolution and the bit rate, which can incur enormous costs.
Each of the example embodiments of the present disclosure described below will contribute to solving the above problem to be solved.
Next, an overview of an operation of each example embodiment of the present disclosure is described.
A bit rate that corresponds to a video resolution can sometimes be obtained without actually watching the video.
For example, video quality information about a certain video such as “get_video_info” shown in
According to “get_video_info” shown in
However, a resolution distribution estimated from the video quality information is different from a resolution distribution when the video is actually shaped. This point will be described with reference to
On the other hand,
Comparing
Therefore, in each example embodiment of the present disclosure, it is assumed that the bit rate corresponding to the video resolution is estimated by modifying the video quality information by using the network quality information (e.g., throughput, frame loss rate, etc.) as shown in
More specifically, in each example embodiment of the present disclosure, it is assumed that the bit rate required to transmit a video having a certain resolution [bps], as shown in
The bit rates (1) to (3) above will be described in detail in the following respective example embodiments of the present disclosure.
Hereinafter, the details of each example embodiment of the present disclosure will be described. The following descriptions and drawings have been omitted and simplified as appropriate for clarity of explanation. In each of the drawings below, the same elements are assigned the same signs, and repeated descriptions are omitted as necessary.
First, a configuration example of a video quality estimation apparatus 20 according to a first example embodiment will be described with reference to
As shown in
The network quality information collection unit 21 collects the network quality information about the network pertaining to distribution of a video. The network quality information includes a frame loss rate, an average throughput, and so on. For example, the network quality information collection unit 21 collects the network quality information set in advance from the network operator.
The network quality information DB 22 stores the network quality information collected by the network quality information collection unit 21.
When the terminal 10, which is a destination for video distribution, is a mobile terminal, the network pertaining to distribution of a video is composed of a radio network between the terminal 10 and a base station 30, which will be described later, a core network, the Internet 70, which will be described later, and a network on the side of the video distribution server 80. The core network may be an MNO (Mobile Network Operator) network 40 described later or the MNO network 40 and an MVNO (Mobile Virtual Network Operator) network 50 described later.
The video quality information collection unit 23 collects the video quality information about each of one or more videos. The video quality information includes a video resolution, a bit rate, and so on. For example, the video quality information collection unit 23 collects the video quality information set in advance from the video distribution server 80 and the network operator.
The video quality information DB 24 stores the video quality information collected by the video quality information collection unit 23.
Based on the network quality information stored in the network quality information DB 22 and the video quality information stored in the video quality information DB 24, the video quality estimation unit 25 estimates the bit rate corresponding to the resolution of the video. Furthermore, the video quality estimation unit 25 estimates a resolution distribution indicating the ratio of each resolution corresponding to the video bit rate.
Here, the video quality estimation unit 25 includes a policy influence calculation unit 251, a loss influence calculation unit 252, an overhead calculation unit 253, and a resolution distribution estimation unit 254.
The policy influence calculation unit 251 calculates (1) the incremental rate RABR [bps] due to a behavior specific to ABR streaming methods shown in
The loss influence calculation unit 252 calculates (2) the bit rate Rloss [bps] of retransmission due to loss as shown in
The overhead calculation unit 253 calculates (3) the bit rate Roverhead [bps] of overhead such as headers shown in
When the resolution distribution estimation unit 254 estimates a bit rate corresponding to the estimation target resolution of the estimation target video, it adds RABR, Rloss, and Roverhead calculated by the policy influence calculation unit 251, the loss influence calculation unit 252, and the overhead calculation unit 253, respectively, to the bit rate corresponding to the estimation target resolution, which is included in the video quality information about the estimation target video, as shown in
Hereinafter, operations of the policy influence calculation unit 251, the loss influence calculation unit 252, the overhead calculation unit 253, and the resolution distribution estimation unit 254 will be described in detail.
First, the operation of the policy influence calculation unit 251 will be described with reference to
As described with reference to
As shown in
This suggests that if the resolution of the video being played is low, the terminal 10 tends to increase the resolution.
Therefore, when the policy influence calculation unit 251 estimates the bit rate corresponding to the estimation target resolution of the estimation target video, it determines the incremental rate RABR due to the behavior specific to the ABR streaming methods according to whether the estimation target resolution, which is included in the video quality information about the estimation target video, is greater than or equal to a standard resolution. In this case, the standard resolution may be, for example, stored in advance in the network quality information DB 22.
i) If the resolution is lower than the standard resolution
If the estimation target resolution is lower than the standard resolution, it is considered that the terminal 10 requests a higher resolution video in an attempt to increase the resolution to the standard resolution.
Therefore, the policy influence calculation unit 251 determines RABR as shown in the following Expression 1.
Note that βR+1 may be a fixed value or a variable value that varies according to the size of the difference from the standard resolution.
ii) If the resolution is greater than or equal to the standard resolution
If the estimation target resolution is greater than or equal to the standard resolution, it is considered that the terminal 10 will continue to request a video in the current resolution, because there is no need to increase the resolution.
Therefore, the policy influence calculation unit 251 determines RABR as in the following Expression 2.
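For illustration only, the two cases above can be sketched in code. The exact forms of Expressions 1 and 2 are given in the drawings and are not reproduced here; the sketch below assumes that Expression 1 adds the bit-rate gap to the next higher resolution (reflecting the terminal's attempt to step up) and that Expression 2 adds nothing. The function name and the dictionary-based interface are assumptions introduced for this example.

```python
def policy_influence_rate(bitrate_by_resolution, target_resolution, standard_resolution):
    """Sketch of the policy influence calculation (assumed forms of Expressions 1 and 2).

    bitrate_by_resolution maps a resolution (e.g. 360) to its bit rate beta_R [bps],
    taken from the video quality information about the estimation target video.
    """
    if target_resolution >= standard_resolution:
        # Assumed Expression 2: the terminal keeps requesting the current resolution,
        # so no increment is added.
        return 0.0
    # Assumed Expression 1: the terminal tries to step up, so add the bit-rate gap
    # beta_{R+1} - beta_R to the next higher resolution. A fixed increment could be
    # used instead, as noted in the description.
    resolutions = sorted(bitrate_by_resolution)
    idx = resolutions.index(target_resolution)
    if idx + 1 < len(resolutions):
        return (bitrate_by_resolution[resolutions[idx + 1]]
                - bitrate_by_resolution[target_resolution])
    return 0.0  # already the highest available resolution
```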
Next, the operation of the loss influence calculation unit 252 will be described.
When a frame loss occurs, video data corresponding to (the number of losses) × 1 frame is retransmitted.
Therefore, when the loss influence calculation unit 252 estimates the bit rate corresponding to the estimation target resolution of the estimation target video, it calculates the bit rate Rloss for retransmission due to frame loss as the expected value of the bit rate of the retransmitted video data.
The expected value E [βR(ρ)] of the bit rate βR(ρ) of the retransmitted video data can be calculated by using the frame loss rate ρ included in the network quality information as shown in Expression 3 below.
Here, the probability of no frame loss is 1 - (the probability that a frame loss occurs). Thus, the probability of no frame loss can be calculated by the following Expression 4.
The first term inside the limit in Expression 3 indicates the bit rate when no frame loss occurs, and the second term indicates the bit rate when n frame losses occur.
Proceeding with the calculation in Expression 3, the following Expression 5 is obtained.
Here, the term raised to the power of n changes faster than the term multiplied by n. Thus, when the limit n→∞ is taken, the power-of-n term is dominant. Therefore, Expression 3 gives the above result.
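As a minimal numerical sketch of the loss influence calculation, assuming that n frame losses occur with probability ρ^n(1 − ρ) and that each loss causes one frame's worth of data (at bit rate β) to be retransmitted, the expected retransmission bit rate works out to β·ρ/(1 − ρ). The exact Expressions 3 to 5 are given in the drawings, so the model below is an assumption for illustration.

```python
def loss_influence_rate(beta, frame_loss_rate, max_n=10_000):
    """Sketch of the retransmission bit rate R_loss (assumed loss model).

    Assumes n losses occur with probability rho**n * (1 - rho) and each loss causes
    one frame's worth of data (bit rate beta) to be retransmitted, so the expected
    extra rate is beta * sum(n * rho**n * (1 - rho)) = beta * rho / (1 - rho).
    """
    rho = frame_loss_rate
    expected_retransmissions = sum(n * rho ** n * (1 - rho) for n in range(max_n))
    return beta * expected_retransmissions  # approximately beta * rho / (1 - rho)
```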
Next, the operation of the overhead calculation unit 253 will be described.
When a video is transmitted, the video data is transmitted with headers attached.
Thus, when the overhead calculation unit 253 estimates the bit rate of a video, the bit rate required to transmit the headers must also be taken into account as overhead. The bit rate required to transmit the headers varies depending on the header size.
Here, the video data is fragmented into units of the Maximum Transmission Unit (MTU) size and then transmitted.
Therefore, when the overhead calculation unit 253 estimates the bit rate corresponding to the estimation target resolution of the estimation target video, it can calculate the bit rate Roverhead for the overhead by the header by using Expression 6 below. In this expression, η is an expected value of the header size, and β is the bit rate. Note that β is the bit rate corresponding to the estimation target resolution, which is included in the video quality information about the estimation target video.
Here, calculating the exact expected value η of the header size would require inspecting the individual packets that make up a frame. Therefore, taking the processing load into account, the maximum header size is used as the expected value η.
For example, if the packets making up the frame are Transmission Control Protocol (TCP)/HTTP packets, the expected header size, η, is obtained by Expression 7 below.
If the packets making up the frame are User Datagram Protocol (UDP)/Quick UDP Internet Connections (QUIC)/HTTP packets, the expected header size, η, is expressed by the following Expression 8.
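A rough sketch of the overhead calculation is shown below, assuming that Expression 6 attributes one header of expected size η to each MTU-sized fragment of video data. The formula β·η/(MTU − η), the MTU default, and the example header sizes are assumptions for illustration; the exact Expressions 6 to 8 are defined in the drawings.

```python
def overhead_rate(beta, expected_header_size, mtu=1500):
    """Sketch of the header overhead bit rate R_overhead (assumed form of Expression 6).

    Assumes the video data is fragmented into MTU-sized packets, each carrying one
    header of (expected) size eta bytes, so the extra rate is roughly
    beta * eta / (mtu - eta). Sizes are in bytes; beta is in bps.
    """
    eta = expected_header_size
    return beta * eta / (mtu - eta)


# Example maximum header sizes that could serve as the expected value eta
# (assumed values; cf. Expressions 7 and 8 in the drawings):
TCP_HTTP_MAX_HEADER = 60 + 60      # IPv4 header (up to 60 B) + TCP header (up to 60 B)
UDP_QUIC_BASE_HEADER = 60 + 8      # IPv4 header (up to 60 B) + UDP header (8 B); QUIC header size varies
```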
Next, the operation of the resolution distribution estimation unit 254 will be described.
When the resolution distribution estimation unit 254 estimates the bit rate corresponding to the estimation target resolution of the estimation target video, it adds RABR, Rloss, and Roverhead calculated by the policy influence calculation unit 251, the loss influence calculation unit 252, and the overhead calculation unit 253, respectively, to the bit rate corresponding to the estimation target resolution, which is included in the video quality information about the estimation target video, as shown in
Therefore, to have the estimation target video played in the estimation target resolution on the terminal 10, the bit rate estimated to correspond to the estimation target resolution of the estimation target video is used as the shaping rate when shaping the estimation target video.
However, shaping at a shaping rate greater than the average throughput of the network does not change the quality of the video played on the terminal 10 compared to shaping at a shaping rate equal to the average throughput.
Thus, the resolution distribution estimation unit 254 adjusts the bit rate estimated to correspond to the estimation target resolution based on the average throughput of the network.
Specifically, the resolution distribution estimation unit 254 adjusts the estimated bit rate to the value of the average throughput if the estimated bit rate is higher than the average throughput, and otherwise leaves the estimated bit rate unchanged.
In this case, the range of the distribution of the shaping rate β would be adjusted, as shown in
If the range of the distribution of the shaping rate β is adjusted, as shown in
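Putting the pieces together, a minimal sketch of the per-resolution estimate performed by the resolution distribution estimation unit 254, including the clamping to the average throughput described above, could look as follows (the function signature is an assumption for illustration).

```python
def estimate_bitrate_for_resolution(beta, r_abr, r_loss, r_overhead, average_throughput):
    """Sketch of the per-resolution bit-rate estimate.

    Adds the three correction rates to the bit rate beta that the video quality
    information gives for the estimation target resolution, then clamps the result
    to the network's average throughput, since shaping above the average throughput
    does not change the quality played on the terminal.
    """
    estimated = beta + r_abr + r_loss + r_overhead
    return min(estimated, average_throughput)
```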
Next, an example of a network arrangement of the video quality estimation apparatus 20 according to the first example embodiment will be described with reference to
The video quality estimation apparatus 20 according to the first example embodiment is disposed inside a band control apparatus 200 for performing band control of videos by, for example, shaping the videos.
In the example of
In the example of
Next, an example of the operation flow of the video quality estimation apparatus 20 according to the first example embodiment will be described with reference to
As shown in
Next, the video quality information collection unit 23 collects the video quality information about each of one or more videos (Step S102). The collected video quality information is stored in the video quality information DB 24.
Note that Steps S101 and S102 are not limited to being performed in this order; they may be performed in reverse order or simultaneously.
Next, the video quality estimation unit 25 selects any one of the one or more videos whose video quality information has been collected by the video quality information collection unit 23 as an estimation target video, and selects any one of the one or more resolutions included in the video quality information about the selected estimation target video as an estimation target (Step S103).
Next, in the video quality estimation unit 25, with regard to the estimation target resolution of the estimation target video, based on the network quality information and the video quality information about the estimation target video, the policy influence calculation unit 251 calculates (1) the incremental rate RABR due to the behavior specific to the ABR streaming methods (Step S104), the loss influence calculation unit 252 calculates (2) the bit rate Rloss for retransmission due to loss (Step S105), and the overhead calculation unit 253 calculates (3) the bit rate Roverhead of overhead such as headers (Step S106).
Steps S104 to S106 are not limited to being performed in this order; they may be performed in any order or simultaneously.
Next, in the video quality estimation unit 25, the resolution distribution estimation unit 254 adds the RABR, Rloss, and Roverhead calculated in Steps S104 to S106, respectively, to the bit rate corresponding to the resolution of the estimation target included in the video quality information about the estimation target video. Next, the resolution distribution estimation unit 254 estimates the bit rate after the addition as the bit rate corresponding to the estimation target resolution of the estimation target video (Step S107). At this time, the resolution distribution estimation unit 254 may adjust the estimated bit rate based on the average throughput of the network.
Next, the video quality estimation unit 25 determines whether or not the video quality information collected by the video quality information collection unit 23 still includes a video and a resolution to be selected as the estimation target (Step S108). For example, if a condition stipulates that all or a predetermined number of video resolutions included in the video quality information should be subject to estimation and if the condition has not yet been satisfied, the determination in Step S108 is Yes.
If there is still a video and a resolution to be selected as the estimation target in Step S108 (Yes in Step S108), the video quality estimation unit 25 returns to the processing in Step S103, selects one video as the estimation target, selects one resolution of the selected video as the estimation target, and then performs the processing in Steps S104 to S107.
On the other hand, if there is no remaining video and resolution to be selected as the estimation target in Step S108 (No in Step S108), in the video quality estimation unit 25, the resolution distribution estimation unit 254 estimates a resolution distribution indicating a ratio of each resolution corresponding to the bit rate based on the estimated result of the bit rate estimated to correspond to the estimation target resolution of the estimation target video (Step S109).
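As an illustration of Step S109, the sketch below derives a resolution distribution from per-video bit-rate estimates under one assumed interpretation: for a given shaping rate, each video is played at the highest resolution whose estimated bit rate fits within that rate, and the distribution is the fraction of videos per resolution. The data structures and the selection rule are assumptions introduced for this example.

```python
def resolution_distribution(per_video_estimates, shaping_rate):
    """Sketch of Step S109 under an assumed interpretation.

    per_video_estimates is a list with one dict per video mapping each resolution
    to the bit rate estimated for it in Step S107.
    """
    chosen = []
    for estimates in per_video_estimates:
        feasible = [r for r, rate in estimates.items() if rate <= shaping_rate]
        # Pick the highest resolution that fits; fall back to the lowest otherwise.
        chosen.append(max(feasible) if feasible else min(estimates))
    total = len(chosen)
    return {r: chosen.count(r) / total for r in set(chosen)}
```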
As described above, according to the first example embodiment, the network quality information collection unit 21 collects the network quality information about the network pertaining to the distribution of the video. The video quality information collection unit 23 collects the video quality information about the videos. The video quality estimation unit 25 estimates the bit rate corresponding to the resolution of the video based on the network quality information and the video quality information.
Specifically, when the video quality estimation unit 25 estimates the bit rate corresponding to the estimation target resolution of the estimation target video, it adds the following bit rates (1) to (3) to the bit rate corresponding to the estimation target resolution, which is included in the video quality information about the estimation target video, and estimates the added bit rate as the bit rate corresponding to the estimation target resolution of the estimation target video.
This allows the network operator to acquire the bit rate that corresponds to the video resolution without having to actually watch the video and collect a huge amount of data. It is therefore possible to acquire the relationship between a video resolution and bit rate without incurring enormous costs.
The effect of the first example embodiment is verified with reference to
The lower left diagram of
Here, in the middle lower diagram and right lower diagram of
As shown in the middle lower diagram of
On the other hand, as shown in the lower right diagram of
Thus, it can be seen that according to the first example embodiment, it is possible to estimate the relationship between a video resolution and a bit rate, i.e., the resolution distribution indicating the ratio of each resolution corresponding to the bit rate, while taking fluctuations in network quality into account.
This makes it possible to estimate the resolution distribution for certain network quality. For example, the resolution distribution can be estimated when the network quality has an average throughput of 3 [Mbps] and a frame loss rate of 0.1 [%].
In addition, if the network operator can confirm that the video can be provided having a high resolution by referring to the resolution distribution according to the first example embodiment, he/she can inform the user using the terminal 10 by saying, for example, “You can watch videos having a high resolution on this network!”.
Moreover, the network operator will be able to use the resolution distribution according to the first example embodiment as a guide to avoid excessive shaping.
If the resolution distribution according to the first example embodiment is not available, events such as shaping all videos at a uniform shaping rate of 300 [kbps] occur.
On the other hand, if the resolution distribution according to the first example embodiment is present, the resolution distribution enables the network operator to know the degree of the shaping rate at which 90% or more of videos can be provided with a resolution of 360 p or higher.
First, a configuration example of a video quality estimation apparatus 20A according to a second example embodiment will be described with reference to
As shown in
The display unit 26 displays video quality such as the resolution distribution estimated by the video quality estimation unit 25 on a screen of the video quality estimation apparatus 20A.
The video quality estimation apparatus 20A according to the second example embodiment is also different from the video quality estimation apparatus 20 according to the first example embodiment described above in that the video quality estimation apparatus 20A according to the second example embodiment assumes that there are a plurality of base stations 30 and estimates the resolution distribution for each of a plurality of areas (cells) of the plurality of base stations 30.
Therefore, the network quality information collection unit 21 collects network quality information for each of the plurality of areas. The video quality estimation unit 25 estimates the resolution distribution for each of the plurality of areas.
An overview of an operation of the video quality estimation apparatus 20A according to the second example embodiment is described below with reference to
As shown in
For each of the three areas 1 to 3 of the respective three base stations 30-1 to 30-3, the network quality information collection unit 21 collects the network quality information including a frame loss rate, an average throughput, and so on of the network in that area. Note that the networks of the three areas 1 to 3 have the same network configuration as that of the MNO network 40 and the network farther from the three base stations 30-1 to 30-3 than the MNO network 40.
In the video quality estimation unit 25, the policy influence calculation unit 251 calculates RABR, the loss influence calculation unit 252 calculates Rloss, and the overhead calculation unit 253 calculates Roverhead for each of the three areas 1 to 3. Next, the resolution distribution estimation unit 254 estimates the resolution distribution for each of the three areas 1 to 3. Note that the method of estimating the resolution distribution itself is the same as that according to the first example embodiment described above, and thus a description thereof is omitted.
Furthermore, the resolution distribution estimation unit 254 estimates the average resolution for each of the three areas 1 to 3 based on the average throughput and the resolution distribution. For example, in the resolution distribution, the resolution distribution estimation unit 254 estimates, as the average resolution, the resolution with the highest ratio when the bit rate corresponds to the average throughput. Specifically, it is assumed that the estimated resolution distribution for an area is the resolution distribution shown in the lower right diagram of
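A minimal sketch of this per-area average-resolution estimate is given below; it assumes a callable that returns the estimated {resolution: ratio} distribution of the area for a given bit rate, which is an interface assumed for illustration.

```python
def average_resolution_for_area(resolution_distribution_at, average_throughput):
    """Sketch: the average resolution of an area is the resolution with the highest
    ratio in the estimated resolution distribution when the bit rate (shaping rate)
    equals the area's average throughput.
    """
    ratios = resolution_distribution_at(average_throughput)  # {resolution: ratio}
    return max(ratios, key=ratios.get)
```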
Next, the display unit 26 displays each of the three areas 1 to 3 on a map and further displays the average resolution of each of the three areas 1 to 3 on the screen of the video quality estimation apparatus 20A.
Note that a display example by the display unit 26 in
In the display example of
Next, an example of the operation flow of the video quality estimation apparatus 20A according to the second example embodiment will be described with reference to
As shown in
Next, the resolution distribution estimation unit 254 estimates the average resolution for each of the three areas 1 to 3 based on the average throughput and the resolution distribution (Step S210).
After that, the display unit 26 displays each of the three areas 1 to 3 on the map and further displays the average resolution of each of the three areas 1 to 3 (Step S211).
As described above, according to the second example embodiment, the video quality estimation unit 25 estimates the resolution distribution for each of the plurality of areas and further estimates the average resolution. The display unit 26 displays each of the plurality of areas on a map and further displays the average resolution of each of the plurality of areas.
This allows a user to know at what level of resolution the video can be delivered in each of the plurality of areas.
Other effects are similar to those of the aforementioned first example embodiment.
Here, a modified example according to the second example embodiment will be described with reference to
As shown in
The resolution distribution estimation unit 254 estimates the average resolution for each of the three areas 1 to 3.
At this time, in an area where the number of camping terminals 10 is large, there is a possibility that the allocated band of the network slice may become insufficient and the average resolution may become lower than a target resolution.
Therefore, when there is an area in which the average resolution is lower than the target resolution, the resolution distribution estimation unit 254 may increase the band of the network slice allocated to that area. In this case, the resolution distribution estimation unit 254 may inform the component responsible for allocating the band of the network slice to each area to increase the band of the network slice allocated to a certain area.
The target resolution is preferably common to the plurality of areas, but it may instead be different for each of the plurality of areas. The target resolution may be stored in advance in, for example, the network quality information DB 22.
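A minimal sketch of this modified example is shown below. The dictionary-based interfaces and the fixed band increment are assumptions for illustration; in the disclosure, the resolution distribution estimation unit 254 would instead inform the component responsible for slice band allocation.

```python
def adjust_slice_bands(average_resolution_by_area, slice_band_by_area,
                       target_resolution, increment_bps=1_000_000):
    """Sketch: if an area's estimated average resolution is below the target
    resolution, request a larger network-slice band for that area.
    """
    adjusted = dict(slice_band_by_area)
    for area, avg_resolution in average_resolution_by_area.items():
        if avg_resolution < target_resolution:
            # The increment value is an assumption; the real allocation would be
            # decided by the slice-allocation component.
            adjusted[area] = adjusted.get(area, 0) + increment_bps
    return adjusted
```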
In the aforementioned first and second example embodiments, the components according to the present disclosure are arranged in one apparatus (in each of the video quality estimation apparatuses 20 and 20A), but the present disclosure is not limited to this. The components of the video quality estimation apparatuses 20 and 20A may be distributed over the network.
In the first and second example embodiments described above, the bit rate corresponding to the estimation target resolution is estimated by adding the following bit rates (1) to (3) to the bit rate corresponding to the estimation target resolution, which is included in the video quality information about the estimation target video, but the present disclosure is not limited to this.
Even if only any one or two of the bit rates from (1) to (3) above are added, the estimated resolution distribution is considered to be close to the resolution distribution when videos are actually shaped. Therefore, any one or two bit rates from (1) to (3) above may be selected and only the selected bit rate(s) may be added. In this case, an amount of calculation can be reduced as compared to that when all the bit rates from (1) to (3) above are added.
Next, a configuration example of a video quality estimation apparatus 100 conceptually showing the video quality estimation apparatuses 20 and 20A according to the aforementioned first and second example embodiments will be described with reference to
The video quality estimation apparatus 100 shown in
The first collection unit 101 corresponds to the network quality information collection unit 21 according to the aforementioned first and second example embodiments. The first collection unit 101 collects the network quality information about the network pertaining to the distribution of the video. The network quality information includes, for example, a frame loss rate, an average throughput, and so on of a network.
The second collection unit 102 corresponds to the video quality information collection unit 23 according to the aforementioned first and second example embodiments. The second collection unit 102 collects the video quality information about each of one or more videos. The video quality information includes, for example, the resolution of the video, a second bit rate of the video corresponding to the resolution, and so on.
The estimation unit 103 corresponds to the video quality estimation unit 25 according to the aforementioned first and second example embodiments. The estimation unit 103 estimates a first bit rate corresponding to the resolution of the video based on the network quality information and the video quality information.
At this time, the estimation unit 103 may specify a value to be added to the second bit rate corresponding to the resolution of the video included in the video quality information based on the network quality information and the video quality information, and estimate the first bit rate corresponding to the resolution of the video. More specifically, the estimation unit 103 may add the value to be added specified above to the second bit rate corresponding to the resolution of the video included in the video quality information, and estimate the bit rate after the addition as the first bit rate corresponding to the resolution of the video.
If the resolution of the video included in the video quality information is lower than the standard resolution, the estimation unit 103 may add a predetermined bit rate to the second bit rate corresponding to the resolution of the video included in the video quality information as a value to be added.
The estimation unit 103 may also calculate the bit rate required for retransmitting the video data due to the frame loss based on the frame loss rate. Next, the estimation unit 103 may add the bit rate required for retransmitting the video data as a value to be added to the second bit rate corresponding to the resolution of the video included in the video quality information.
The estimation unit 103 may also calculate the bit rate required for transmitting the header based on the size of the header of the packet of the video data. Then, the estimation unit 103 may add the bit rate required for transmitting the header as a value to be added to the second bit rate corresponding to the resolution of the video included in the video quality information.
If the first bit rate estimated to correspond to the resolution of the video is higher than the average throughput, the estimation unit 103 may adjust the first bit rate to the value of the average throughput.
The estimation unit 103 may also estimate the first bit rate corresponding to one or more resolutions of one or more videos, and estimate a resolution distribution indicating a ratio of each resolution corresponding to the first bit rate based on the estimation result.
The video quality estimation apparatus 100 may further include a display unit. This display unit corresponds to the display unit 26 according to the second example embodiment described above. The estimation unit 103 may also estimate the resolution distribution for each of the plurality of areas and estimate the average resolution based on the estimated resolution distribution and the average throughput. The display unit may display each of the plurality of areas on the map and display the average resolution of each of the plurality of areas. Alternatively, the display unit may display the first bit rate corresponding to the resolution of the video estimated by the estimation unit 103.
In addition, the band of the network slice may be allocated to each of the plurality of areas. If there is an area with an estimated average resolution lower than the target resolution among the plurality of areas, the estimation unit 103 may increase the band of the network slice allocated to that area.
Next, an example of an operation flow of the video quality estimation apparatus 100 shown in
As shown in
Next, the second collection unit 102 collects the video quality information about the video (Step S302).
Steps S301 and S302 are not limited to being performed in this order; they may be performed in any order or simultaneously.
After that, the estimation unit 103 estimates the bit rate corresponding to the resolution of the video based on the network quality information collected in Step S301 and the video quality information collected in Step S302 (Step S303).
As described above, according to the video quality estimation apparatus 100 shown in
This allows network operators to acquire the bit rate that corresponds to the video resolution without having to actually watch the video and collect a huge amount of data. It is therefore possible to acquire the relationship between a video resolution and bit rate without incurring enormous costs.
Next, a configuration example of a video quality estimation system including the video quality estimation apparatus 100 shown in
The video quality estimation system shown in
The terminal 10 and the video quality estimation apparatus 100 are connected to the network 110.
The video distribution server 80 on the network 110 distributes videos to the terminal 10.
When the terminal 10 is a mobile terminal, the network 110 is a network composed of a radio network between the terminal 10 and the base station 30, a core network, the Internet 70, and a network on the side of the video distribution server 80. The core network may be the MNO network 40, or the MNO network 40 and the MVNO network 50.
Next, a hardware configuration of a computer 90 for implementing the video quality estimation apparatuses 20 and 20A according to the aforementioned first and second example embodiments and the video quality estimation apparatus 100 relating to the concept of the above example embodiments will be described with reference to
As shown in
The processor 91 is, for example, an arithmetic processing unit such as a CPU (Central Processing Unit) or a GPU (Graphics Processing Unit). The memory 92 is, for example, a memory such as RAM (Random Access Memory) or ROM (Read Only Memory). The storage 93 is, for example, a storage apparatus such as an HDD (Hard Disk Drive), an SSD (Solid State Drive), or a memory card. The storage 93 may be a memory such as RAM or ROM.
The storage 93 stores programs for implementing the functions of the components included in the video quality estimation apparatuses 20, 20A, and 100. By executing each of these programs, the processor 91 implements the functions of the components included in the video quality estimation apparatuses 20, 20A, and 100. Here, the processor 91 may execute the above programs after reading them into the memory 92, or may execute them without reading them into the memory 92. The memory 92 and the storage 93 also serve to store information and data held by the components of the video quality estimation apparatuses 20, 20A, and 100.
Further, the above program can be stored and provided to a computer (including the computer 90) using any type of non-transitory computer readable media. Non-transitory computer readable media include any type of tangible storage media. Examples of non-transitory computer readable media include magnetic storage media (such as floppy disks, magnetic tapes, and hard disk drives), optical magnetic storage media (e.g., magneto-optical disks), CD-ROM (compact disc ROM), CD-R (CD-Recordable), CD-R/W (CD-Rewritable), and semiconductor memories (such as mask ROM, PROM (programmable ROM), EPROM (erasable PROM), flash ROM, and RAM).
The program may be provided to a computer using any type of transitory computer readable media. Examples of transitory computer readable media include electric signals, optical signals, and electromagnetic waves. Transitory computer readable media can provide the program to a computer via a wired communication line (e.g. electric wires, and optical fibers) or a wireless communication line.
The input/output interface 94 is connected to a display apparatus 941, an input apparatus 942, a sound output apparatus 943, etc. The display apparatus 941 is an apparatus that displays a screen corresponding to drawing data processed by the processor 91, such as an LCD (Liquid Crystal Display), CRT (Cathode Ray Tube) display, or monitor. The input apparatus 942 is an apparatus that accepts the operator’s operational input, such as a keyboard, mouse, and touch sensor. The display apparatus 941 and the input apparatus 942 may be integrated and implemented as a touch panel. The sound output apparatus 943 is an apparatus such as a speaker that outputs sound corresponding to the sound data processed by the processor 91.
The communication interface 95 transmits and receives data to and from an external apparatus. For example, the communication interface 95 communicates with the external apparatus via a wired or wireless channel.
Although the present disclosure has been described above with reference to the example embodiments, the disclosure is not limited to the example embodiments described above. Various changes in the configuration and details of the present disclosure may be made that would be understandable to a person skilled in the art within the scope of the present disclosure.
The whole or part of the example embodiments disclosed above can be described as, but not limited to, the following supplementary notes.
A video quality estimation apparatus comprising:
The video quality estimation apparatus according to Supplementary note 1, wherein
The video quality estimation apparatus according to Supplementary note 2, wherein
the estimation unit adds a predetermined bit rate as the value to be added to the second bit rate corresponding to the resolution of the video included in the video quality information when the resolution of the video included in the video quality information is lower than a standard resolution.
The video quality estimation apparatus according to Supplementary note 2 or 3, wherein
The video quality estimation apparatus according to any one of Supplementary notes 2 to 4, wherein
The video quality estimation apparatus according to any one of Supplementary notes 2 to 5, wherein
The video quality estimation apparatus according to any one of Supplementary notes 1 to 6, further comprising:
a display unit configured to display the first bit rate corresponding to the resolution of the video estimated by the estimation unit.
The video quality estimation apparatus according to any one of Supplementary notes 2 to 6, wherein
the estimation unit estimates the first bit rate corresponding to each of one or more of the resolutions of one or more of the videos, and estimates a resolution distribution indicating a ratio of each resolution corresponding to the first bit rate based on a result of the estimation.
The video quality estimation apparatus according to Supplementary note 8, further comprising:
a display unit, wherein
The video quality estimation apparatus according to Supplementary note 9, wherein
A video quality estimation method comprising:
The video quality estimation method according to Supplementary note 11, wherein
The video quality estimation method according to Supplementary note 12, wherein
in the estimating, a predetermined bit rate as the value to be added is added to the second bit rate corresponding to the resolution of the video included in the video quality information when the resolution of the video included in the video quality information is lower than a standard resolution.
The video quality estimation method according to Supplementary note 12 or 13, wherein
The video quality estimation method according to any one of Supplementary notes 12 to 14, wherein
The video quality estimation method according to any one of Supplementary notes 12 to 15, wherein
The video quality estimation method according to any one of Supplementary notes 11 to 16, further comprising:
displaying the first bit rate corresponding to the resolution of the video estimated in the estimating.
The video quality estimation method according to any one of Supplementary notes 12 to 16, wherein
in the estimating, the first bit rate corresponding to each of one or more of the resolutions of one or more of the videos is estimated, and a resolution distribution indicating a ratio of each resolution corresponding to the first bit rate based on a result of the estimation is estimated.
The video quality estimation method according to Supplementary note 18, wherein
The video quality estimation method according to Supplementary note 19, wherein
A video quality estimation system comprising:
The video quality estimation system according to Supplementary note 21, wherein
The video quality estimation system according to Supplementary note 22, wherein
the estimation unit adds a predetermined bit rate as the value to be added to the second bit rate corresponding to the resolution of the video included in the video quality information when the resolution of the video included in the video quality information is lower than a standard resolution.
The video quality estimation system according to Supplementary note 22 or 23, wherein
The video quality estimation system according to any one of Supplementary notes 22 to 24, wherein
The video quality estimation system according to any one of Supplementary notes 22 to 25, wherein
The video quality estimation system according to any one of Supplementary notes 21 to 26, further comprising:
a display unit configured to display the first bit rate corresponding to the resolution of the video estimated by the estimation unit.
The video quality estimation system according to any one of Supplementary notes 22 to 26, wherein
the estimation unit estimates the first bit rate corresponding to each of one or more of the resolutions of one or more of the videos, and estimates a resolution distribution indicating a ratio of each resolution corresponding to the first bit rate based on a result of the estimation.
The video quality estimation system according to Supplementary note 28, further comprising:
a display unit, wherein
The video quality estimation system according to Supplementary note 29, wherein
10 terminal
20, 20A video quality estimation apparatus
21 network quality information collection unit
22 network quality information DB
23 video quality information collection unit
24 video quality information DB
25 video quality estimation unit
251 policy influence calculation unit
252 loss influence calculation unit
253 overhead calculation unit
254 resolution distribution estimation unit
26 display unit
30 base station
40 MNO network
41
42
43
44
50 MVNO network
51
52
53
60
70 Internet
80 video distribution server
90 computer
91 processor
92 memory
93 storage
94 input/output interface
941 display apparatus
942 input apparatus
943 sound output apparatus
95 communication interface
100 video quality estimation apparatus
101 first collection unit
102 second collection unit
103 estimation unit
110 network
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/JP2020/032174 | 8/26/2020 | WO |