The disclosure relates to a method and a control system for playing a three-dimensional video, and more particularly, to a method and a control system for playing a three-dimensional video using visual fatigue estimation.
Ergonomic issues have always existed in three-dimensional display. For instance, three-dimensional videos can easily cause visual fatigue in viewers. In recent years, many research institutions have investigated the effects of three-dimensional video content on humans, and the industry is currently attempting to set a standard for three-dimensional display based on the results of these investigations.
Among current technologies, disparity adjustment is the approach most widely adopted by the related industries to reduce a viewer's visual fatigue while watching a three-dimensional video. In recent technologies, disparity adjustment is performed mainly according to a disparity range. However, considering only the disparity range of the three-dimensional video is not enough to prevent viewers from experiencing visual fatigue.
Accordingly, the disclosure is directed to a method and a control system for playing a three-dimensional video using visual fatigue estimation.
The disclosure provides a method for playing a three-dimensional video, which includes the following steps. A disparity velocity or a disparity acceleration for at least one continuous video in the three-dimensional video is calculated. A visual fatigue estimating value of a viewer is calculated according to the disparity velocity or the disparity acceleration. A subsequent playback of the three-dimensional video is controlled according to the visual fatigue estimating value.
The disclosure provides a control system adapted to control a playback of a three-dimensional video. The control system includes a three-dimensional video stream input unit, a disparity estimation unit, a visual fatigue estimation unit, a fatigue recovery control unit and a three-dimensional video stream display unit. The disparity estimation unit is coupled to the three-dimensional video stream input unit, and the disparity estimation unit calculates a disparity velocity or a disparity acceleration for at least one continuous video in the three-dimensional video. The visual fatigue estimation unit is coupled to the disparity estimation unit, and the visual fatigue estimation unit calculates a visual fatigue estimating value of a viewer according to the disparity velocity or the disparity acceleration. The fatigue recovery control unit is coupled to the three-dimensional video stream input unit and the visual fatigue estimation unit, and the fatigue recovery control unit controls a subsequent playback of the three-dimensional video according to the visual fatigue estimating value. The three-dimensional video stream display unit is coupled to the fatigue recovery control unit.
Several exemplary embodiments accompanied with figures are described in detail below to further describe the disclosure.
The accompanying drawings are included to provide further understanding, and are incorporated in and constitute a part of this specification. The drawings illustrate exemplary embodiments and, together with the description, serve to explain the principles of the disclosure.
In an embodiment of the disclosure, the visual fatigue estimating value (Fa) may be calculated and obtained according to the disparity velocity (DS) for the at least one continuous video in the three-dimensional video. Here, the disparity velocity (DS) can be defined as a change of a disparity range within a time unit. For example, the visual fatigue estimating value (Fa) may be calculated through Formula (1) and Formula (2) as follows, in which fa is a visual fatigue estimating value of the viewer after watching each of the time units in the continuous video, Fa is an overall visual fatigue estimating value of the viewer after watching the continuous video, and T is a time.
fa=f(DS) (1)
Fa=f(fa,T) (2)
It can be known from Formulas (1) and (2) above that the visual fatigue estimating value (fa) is related to the disparity velocity (DS), whereas the overall visual fatigue estimating value (Fa) is related to the visual fatigue estimating value (fa) and the time (T). When the absolute value of the disparity velocity (DS) becomes greater, the visual fatigue estimating value (fa) becomes higher. Conversely, when the absolute value of the disparity velocity (DS) becomes smaller (that is, when the disparity velocity (DS) is closer to zero), the visual fatigue estimating value (fa) becomes lower.
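The relationship expressed by Formulas (1) and (2) can be sketched as follows. The linear mapping fa = |DS| and the plain summation over the time units are illustrative assumptions; the disclosure only requires that fa increase with the magnitude of the disparity velocity and that Fa accumulate fa over the watched time.

```python
# Illustrative sketch of Formulas (1) and (2). The specific function
# fa = |DS| is an assumed example; the disclosure only specifies that
# fa grows as |DS| grows and that Fa accumulates fa over time units.

def fa_from_disparity_velocity(ds):
    """Per-time-unit fatigue: larger |DS| yields a higher fa (Formula (1))."""
    return abs(ds)

def overall_fatigue(ds_per_unit):
    """Overall fatigue Fa accumulated over the watched time units (Formula (2))."""
    return sum(fa_from_disparity_velocity(ds) for ds in ds_per_unit)

# Disparity velocities near zero contribute little fatigue;
# fast disparity changes dominate the accumulated value.
print(overall_fatigue([0.1, 2.0, -1.5]))  # ≈ 3.6
```

The same sketch applies to the disparity-acceleration embodiment below by substituting DA for DS in the per-unit function.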
In another embodiment of the disclosure, the visual fatigue estimating value (Fa) may be calculated and obtained according to the disparity acceleration (DA) for the at least one continuous video in the three-dimensional video. Here, the disparity acceleration (DA) can be defined as a change of the disparity velocity (DS) within a time unit. For example, the visual fatigue estimating value (Fa) may be calculated through Formula (3) and Formula (4) as follows, in which fa is a visual fatigue estimating value of the viewer after watching each of the time units in the continuous video, Fa is an overall visual fatigue estimating value of the viewer after watching the continuous video, and T is a time.
fa=f(DA) (3)
Fa=f(fa,T) (4)
Similarly, as shown in
In other embodiments of the disclosure, besides the disparity velocity (DS) and/or the disparity acceleration (DA) described above, parameters such as time (T), temporal weight (RT), disparity velocity weight (Vi), disparity range weight (Wd), disparity mean position (P) and disparity direction (DD) may also serve as parameters for calculating the overall visual fatigue estimating value (Fa).
For example, the visual fatigue estimating value (Fa) may be calculated through Formula (5) and Formula (6) as follows, in which X in Formula (5) can be at least one of the afore-described disparity acceleration (DA), time (T), temporal weight (RT), disparity velocity weight (Vi), disparity range weight (Wd), disparity mean position (P), disparity direction (DD), lateral velocity, brightness and contrast, while fa is the visual fatigue estimating value of the viewer after watching each of the time units in the continuous video, and Fa is the overall visual fatigue estimating value of the viewer after watching the continuous video.
fa=f(DS,X) (5)
Fa=f(fa,T) (6)
As shown in
In another embodiment, the visual fatigue estimating value (Fa) may also be calculated through Formula (7) and Formula (8) as follows, in which X in Formula (7) can be at least one of the afore-described disparity velocity (DS), time (T), temporal weight (RT), disparity velocity weight (Vi), disparity range weight (Wd), disparity mean position (P), disparity direction (DD), lateral velocity, brightness and contrast, while fa is the visual fatigue estimating value of the viewer after watching each of the time units in the continuous video, and Fa is the overall visual fatigue estimating value of the viewer after watching the continuous video.
fa=f(DA,X) (7)
Fa=f(fa,T) (8)
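Formulas (5) through (8) can be sketched by treating the additional parameters X as optional weights on the per-unit fatigue. The multiplicative combination and the particular weights below (Vi, Wd, RT) are assumptions for illustration; the disclosure leaves the form of f open.

```python
# Hypothetical sketch of Formulas (5)-(8): fa depends on the disparity
# velocity (or acceleration) together with additional parameters X.
# Here X is a dict of optional weights, combined multiplicatively as an
# illustrative assumption, not as the disclosure's fixed formula.

def fa_weighted(ds, weights=None):
    weights = weights or {}
    vi = weights.get("Vi", 1.0)  # disparity velocity weight
    wd = weights.get("Wd", 1.0)  # disparity range weight
    rt = weights.get("RT", 1.0)  # temporal weight
    return vi * wd * rt * abs(ds)

def overall_fatigue_weighted(samples):
    """samples: iterable of (DS, weights) pairs, one per time unit."""
    return sum(fa_weighted(ds, w) for ds, w in samples)

print(overall_fatigue_weighted([(1.0, {"Vi": 2.0}), (0.5, {"Wd": 4.0})]))  # 4.0
```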
As shown in
The disclosure provides a control system 100 to implement the afore-described method for playing the three-dimensional video, in order to control the playback of the three-dimensional video and effectively control the visual fatigue of the viewer.
In the embodiment, the three-dimensional video stream input unit 101 reads a three-dimensional video stream from a video storage medium or an internet video stream into the control system 100. For example, the three-dimensional video stream read by the three-dimensional video stream input unit 101 can be a single-view three-dimensional video stream or a multi-view three-dimensional video stream.
After the three-dimensional video stream is outputted from the three-dimensional video stream input unit 101, the disparity estimation unit 102 divides the three-dimensional video stream into a plurality of video streams, and calculates the disparity velocity (DS) and/or the disparity acceleration (DA) for the video stream at each of the time units. In the embodiment, each of the time units is two seconds, for instance. In other embodiments, each of the time units corresponds to one or more groups of pictures (GOP) in the three-dimensional video, for instance. However, the method by which the disparity estimation unit 102 divides the three-dimensional video stream is not limited in the disclosure.
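The division into time units can be sketched as follows, assuming a per-frame disparity sequence, a two-second time unit as in the embodiment, and a hypothetical frame rate of 30 fps. The mean frame-to-frame change is one possible definition of the per-unit disparity velocity.

```python
# Illustrative splitting of a per-frame disparity sequence into fixed
# time units (2 s at an assumed 30 fps -> 60 frames per unit), then a
# mean disparity velocity per unit as the frame-to-frame change.
# The frame rate and the velocity definition are assumptions.

FPS = 30
UNIT_SECONDS = 2
FRAMES_PER_UNIT = FPS * UNIT_SECONDS

def split_into_units(disparities):
    """Divide the per-frame disparity sequence into consecutive time units."""
    return [disparities[i:i + FRAMES_PER_UNIT]
            for i in range(0, len(disparities), FRAMES_PER_UNIT)]

def disparity_velocity(unit):
    """Mean frame-to-frame disparity change within one time unit."""
    deltas = [b - a for a, b in zip(unit, unit[1:])]
    return sum(deltas) / len(deltas) if deltas else 0.0
```

A GOP-based division would differ only in how the slice boundaries are chosen.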
In the embodiment, the disparity estimation unit 102 calculates one or several disparity velocities (DS) and/or disparity accelerations (DA) for the video stream in each of the time units after the division. For example, the disparity estimation unit 102 may calculate one disparity velocity (DS) for the video stream at each of the time units. Alternatively, the disparity estimation unit 102 may calculate the disparity velocity (DS), a maximum disparity velocity (DSmax) and/or a minimum disparity velocity (DSmin) for the video stream at each of the time units, or the disparity estimation unit 102 may calculate the disparity acceleration (DA) for the video stream at each of the time units. In other embodiments, the disparity estimation unit 102 may further calculate other parameters, such as the disparity range (D), the lateral velocity, the content brightness or the contrast, for the video stream at each of the time units.
Additionally, the calculation of the disparity velocity (DS) and/or the disparity acceleration (DA) by the disparity estimation unit 102 may be performed on all or a part of the regions in the continuous video. For example, the disparity estimation unit 102 can calculate the disparity velocity (DS) and/or the disparity acceleration (DA) of only the central region in the continuous video, or of only a dynamic body object in the continuous video.
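Restricting the statistics to the central region can be sketched as below. The disparity map is represented as a 2-D list, and the crop fraction is an assumed parameter for illustration.

```python
# Sketch of restricting disparity statistics to the central region of a
# frame, one of the options mentioned above. The crop fraction of 0.5
# (central half in each dimension) is an illustrative assumption.

def central_region(disparity_map, fraction=0.5):
    """Return the central `fraction` crop of a 2-D disparity map."""
    rows, cols = len(disparity_map), len(disparity_map[0])
    r0, r1 = int(rows * (1 - fraction) / 2), int(rows * (1 + fraction) / 2)
    c0, c1 = int(cols * (1 - fraction) / 2), int(cols * (1 + fraction) / 2)
    return [row[c0:c1] for row in disparity_map[r0:r1]]

def mean_disparity(disparity_map):
    """Mean disparity over all pixels of the (possibly cropped) map."""
    values = [v for row in disparity_map for v in row]
    return sum(values) / len(values)
```

Tracking a dynamic body object would replace the fixed crop with a per-frame object mask, but the downstream velocity calculation is unchanged.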
The disparity velocity (DS) and/or the disparity acceleration (DA) calculated by the disparity estimation unit 102 may be transferred to the visual fatigue estimation unit 103. The visual fatigue estimation unit 103 calculates the overall visual fatigue estimating value (Fa) according to the disparity velocity (DS) and/or the disparity acceleration (DA).
Subsequently, the visual fatigue estimation unit 103 transfers the calculated overall visual fatigue estimating value (Fa) to the fatigue recovery control unit 104, so that the fatigue recovery control unit 104 can control the subsequent playback of the three-dimensional video stream display unit 105 according to the overall visual fatigue estimating value (Fa), thereby effectively reducing the visual fatigue of the viewer. For example, when the overall visual fatigue estimating value (Fa) is excessively large, the fatigue recovery control unit 104 is capable of reducing the visual fatigue of the viewer by means such as decreasing the disparity range, decreasing the disparity velocity, decreasing the disparity acceleration, lowering the display contrast, lowering the display brightness, varying the playback velocity, directly switching to two-dimensional display, or generating an alert to notify the viewer. Conversely, when the overall visual fatigue estimating value (Fa) remains small over a sustained period, the fatigue recovery control unit 104 is capable of enhancing the three-dimensional perception effect of the images by means such as increasing the disparity range, increasing the disparity velocity, or increasing the disparity acceleration.
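The decision logic of the fatigue recovery control unit 104 can be sketched with two thresholds. The threshold values and the specific actions chosen are hypothetical; the disclosure lists several interchangeable options for each case.

```python
# Hypothetical decision logic for the fatigue recovery control unit 104:
# above an upper threshold, take a fatigue-reducing action; below a lower
# threshold sustained over time, the 3-D effect may be enhanced. The
# threshold values and chosen actions are illustrative assumptions.

HIGH_FATIGUE = 10.0
LOW_FATIGUE = 2.0

def playback_adjustment(fa_overall):
    if fa_overall > HIGH_FATIGUE:
        # Any of the disclosed options could be applied here, e.g.
        # lowering brightness/contrast or switching to 2-D display.
        return "reduce_disparity_range"
    if fa_overall < LOW_FATIGUE:
        # Enhance the three-dimensional perception effect.
        return "increase_disparity_range"
    return "no_change"

print(playback_adjustment(12.0))  # reduce_disparity_range
```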
The three-dimensional video stream display unit 105 displays the images processed by the fatigue recovery control unit 104 to the viewer.
The method for calculating the visual fatigue estimating value (Fa) is illustrated below with Example 1 through Example 10 to further describe the disclosure.
The disparity estimation unit 102 (as shown in Fa). Here, the disparity velocity (DS) can be, for instance, the mean disparity velocity or the maximum disparity velocity within a certain time. In Example 1, the formula for calculating the visual fatigue estimating value Fa is: Fa=fa=DS.
The disparity estimation unit 102 (as shown in
The disparity estimation unit 102 (as shown in
The disparity estimation unit 102 (as shown in
As shown in
It is assumed that the disparity velocity (DS) of the video stream for each of the time units within a certain time interval in the three-dimensional video is distributed between DS1 and DS3. When the disparity velocity (DS) of the video stream falls within the range of DS1 to DS2, a low numerical value should be selected for the disparity velocity weight (Vi), so as to calculate the visual fatigue estimating value (fa) of such a video stream. When the disparity velocity (DS) of the video stream falls within the range of DS2 to DS3, a high numerical value should be selected for the disparity velocity weight (Vi), so as to calculate the visual fatigue estimating value (fa) of such a video stream. For example, the visual fatigue estimating value (fa) can be a quadratic or higher-order function of the disparity velocity (DS) (i.e., a non-linear relationship).
According to the above, the overall visual fatigue estimating value (Fa) can then be obtained by the formula: Fa=Σ_T fa, i.e., by summing the visual fatigue estimating value (fa) over the time units (T).
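The piecewise disparity velocity weight described above can be sketched as follows. The breakpoints DS1 through DS3 and the two weight values are assumed numbers for illustration.

```python
# Sketch of the piecewise disparity velocity weight (Vi): a low weight
# for |DS| in [DS1, DS2) and a high weight for |DS| in [DS2, DS3].
# The breakpoints and weight values are illustrative assumptions.

DS1, DS2, DS3 = 0.0, 1.0, 3.0
VI_LOW, VI_HIGH = 0.5, 2.0

def velocity_weight(ds):
    """Select a low or high weight depending on where |DS| falls."""
    return VI_LOW if abs(ds) < DS2 else VI_HIGH

def fa(ds):
    """Per-time-unit fatigue with the velocity weight applied (assumed form fa = Vi * |DS|)."""
    return velocity_weight(ds) * abs(ds)
```

Because the weight jumps at DS2, fa grows faster than linearly across the range, consistent with the quadratic-or-higher relationship mentioned above.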
The disparity estimation unit 102 (as shown in
The disparity estimation unit 102 (as shown in
As shown in
It is assumed that the disparity velocity (DS) for each of the video streams within a certain time interval in the three-dimensional video is distributed between DS1 and DS3, and the disparity range (D) is between D1 and D2. When the disparity range (D) of the video stream falls near D1, a low numerical value should be selected for the disparity range weight (Wd). When the disparity range (D) of the video stream falls near D2, a high numerical value should be selected for the disparity range weight (Wd). Additionally, when the disparity velocity (DS) of the video stream falls within the range of DS1 to DS2, a low numerical value should be selected for the disparity velocity weight (Vi). When the disparity velocity (DS) of the video stream falls within the range of DS2 to DS3, a high numerical value should be selected for the disparity velocity weight (Vi). For example, the visual fatigue estimating value (fa) can be a quadratic or higher-order function of the disparity velocity (DS) and of the disparity range (D) (i.e., non-linear relationships).
According to the above, the overall visual fatigue estimating value (Fa) can then be obtained by the formula: Fa=Σ_T fa, i.e., by summing the visual fatigue estimating value (fa) over the time units (T).
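Combining both weights can be sketched as below. The linear interpolation for Wd, the step function for Vi, and all numeric values are assumptions for illustration.

```python
# Combined sketch: the disparity range weight (Wd) and the disparity
# velocity weight (Vi) are each selected from their respective ranges
# and multiplied into the per-unit fatigue. All values are assumed.

D1, D2 = 0.0, 10.0   # assumed disparity range endpoints
DS2 = 1.0            # assumed velocity breakpoint

def range_weight(d):
    """Low weight near D1, high weight near D2 (linear interpolation assumed)."""
    return 0.5 + 1.5 * (d - D1) / (D2 - D1)

def velocity_weight(ds):
    """Low weight below DS2, high weight at or above it."""
    return 0.5 if abs(ds) < DS2 else 2.0

def fa_combined(ds, d):
    """Per-unit fatigue with both weights applied (assumed form fa = Vi * Wd * |DS|)."""
    return velocity_weight(ds) * range_weight(d) * abs(ds)
```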
The disparity estimation unit 102 (as shown in
According to the above, the overall visual fatigue estimating value (Fa) can then be obtained by the formula: Fa=Σ_T fa, i.e., by summing the visual fatigue estimating value (fa) over the time units (T).
As shown in
When the disparity range (D) of the video stream falls near D1, since the disparity range is relatively small, the disparity velocities (i.e., DS1, DS2 and DS3) have no significant impact on the subjective fatigue. In this case, the visual fatigue estimating value (fa) may be obtained with one calculation method. When the disparity range (D) of the video stream falls near D2 or D3, since the disparity range is relatively large, the disparity velocities (i.e., DS1, DS2 and DS3) have a significant impact on the subjective fatigue. In this case, the visual fatigue estimating value (fa) may be obtained with another calculation method.
The disparity estimation unit 102 (as shown in
In Example 8, the visual fatigue estimating value (fa) can be a quadratic or higher-order (i.e., non-linear) function of the disparity velocity (DS), of the disparity range (D), and of the time (T).
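A minimal non-linear sketch of Example 8 is shown below. The purely quadratic form with all coefficients equal to 1 is an assumption; the disclosure only requires a quadratic or higher-order dependence on each parameter.

```python
# Assumed quadratic form for Example 8: fa grows quadratically with each
# of DS, D and T. The unit coefficients are illustrative assumptions.

def fa_nonlinear(ds, d, t):
    """Per-unit fatigue as a quadratic function of DS, D and T (assumed form)."""
    return (ds ** 2) * (d ** 2) * (t ** 2)

def overall_fatigue_nonlinear(samples):
    """samples: (DS, D, T) per time unit; Fa sums fa over the units."""
    return sum(fa_nonlinear(ds, d, t) for ds, d, t in samples)
```

Doubling the disparity velocity quadruples the per-unit fatigue here, which reflects the non-linear relationship described in the example.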
The disparity estimation unit 102 (as shown in
In Example 9, the formula for calculating the visual fatigue estimating value (fa) is: fa=DS_i·Q_j, and the formula for calculating the overall visual fatigue estimating value (Fa) is: Fa=Σ_T fa.
The disparity estimation unit 102 (as shown in
In summary, since the disclosure calculates the visual fatigue estimating value according to the disparity velocity or the disparity acceleration, the method and the control system in the disclosure are capable of reducing the visual fatigue of the viewer effectively.
It will be apparent to those skilled in the art that various modifications and variations can be made to the structure of the disclosed embodiments without departing from the scope or spirit of the disclosure. In view of the foregoing, it is intended that the disclosure cover modifications and variations of this disclosure provided they fall within the scope of the following claims and their equivalents.
This application claims the priority benefits of U.S. provisional application Ser. No. 61/715,792, filed on Oct. 18, 2012 and China application serial no. 201310073099.3, filed on Mar. 7, 2013. The entirety of each of the above-mentioned patent applications is hereby incorporated by reference herein and made a part of this specification.
This application was published as US 2014/0111626 A1 in Apr. 2014.