Embodiments described herein relate generally to a video reproduction apparatus, a video reproduction method and a video display apparatus.
Recent video display apparatuses are able to display, by the employment of a wide color gamut display panel, video images having a wider color gamut (wide gamut: ITU-R BT.2020, referred to as BT.2020) than a standard color gamut (narrow gamut: ITU-R BT.709-3, referred to as BT.709). These apparatuses are able to display video images faithful to materials, exhibiting clean colors, and imparting a natural impression. For such a wide-gamut-compliant video display apparatus, a video reproduction apparatus is required to output images without increasing their color gamut, because if the color gamut of wide-gamut video images obtained by an imaging device having a wide color gamut is further increased, the colors of the video images become too deep, thereby degrading the image quality. Conversely, if video images having a narrow gamut obtained by an imaging device having a narrow color gamut are output directly, the color gamut and/or performance of a display apparatus compliant with a wide-color-gamut signal standard cannot be fully exploited, and the displayed video images lack vividness. In this case, therefore, it is desirable to perform appropriate color gamut increase processing.
As described above, in video reproduction apparatuses, there is a demand for displaying many video images in a wide color gamut by selectively increasing the color gamut in accordance with the original color gamut in which the video image corresponding to an input video signal was obtained. At present, however, attribute information associated with a video signal does not include original color gamut information. Further, even if original color gamut information is added in the future, it is not guaranteed that the information will always be correct. Thus, in conventional video reproduction apparatuses, it is difficult to automatically set an appropriate color space by real-time processing and thereby reproduce video images created under various color gamut setting conditions.
A general architecture that implements the various features of the embodiments will now be described with reference to the drawings. The drawings and the associated descriptions are provided to illustrate the embodiments and not to limit the scope of the invention.
Various embodiments will be described hereinafter with reference to the accompanying drawings. In general, according to one embodiment, a video reproduction apparatus comprises: a chromaticity diagram coordinate calculator which calculates two-dimensional coordinate data of an input video signal on a chromaticity diagram; a color gamut determining unit which compares the two-dimensional coordinate data calculated by the chromaticity diagram coordinate calculator, with a preset standard color gamut on the chromaticity diagram, and determines, based on a result of the comparison, whether a color gamut corresponding to the input video signal falls within the standard color gamut; and a color gamut increasing unit which adjusts color gamut increase processing and performs the adjusted color gamut increase processing on the input video signal, based on a determination result of the color gamut determining unit.
The digital tuner unit 102 inputs, to a TS processor 122, obtained transport streams (TS) corresponding to a plurality of channels. Each TS includes a packet sequence associated with a broadcast program corresponding to a channel. Some packets include, for example, control information.
The TS processor 122 multiplexes a plurality of TSs corresponding to a plurality of channels to form one TS. The multiplexed TS includes packet sequences corresponding to the broadcast programs of the channels. A packet corresponding to each channel additionally includes identification information for channel and packet identification.
The multiplexed TS is input to and stored in a storage unit 111. The packets containing control information and included in the TSs input to the TS processor 122 are further input to and processed by a controller 200.
The storage unit 111 includes, for example, a hard disk drive and an optical disk recording/reproducing unit. The optical disk includes a digital versatile disc (DVD (trademark)), a Blu-ray disc (BD (trademark)), etc.
The control information contained in a packet sent from the TS processor 122 to the controller 200 includes, for example, an entitlement control message (ECM) as encrypted information of broadcast programs, an event information table (EIT) showing event information, such as program names, performers and start times, and an electronic program guide (EPG).
Video data contained in a packet is already encoded utilizing, for example, Moving Picture Experts Group (MPEG) coding or advanced video coding (AVC). Further, audio data in an audio packet is already encoded by, for example, pulse code modulation (PCM), a Dolby scheme or MPEG.
The TS processor 122 can select a TS from the storage unit 111 or the digital tuner unit 102 to perform reproduction, based on a control signal from the controller 200. In other words, the TS processor 122 can separate an audio packet containing audio data associated with a program to be reproduced, from a video packet containing video data associated with the program, based on the control signal from the controller 200.
The audio packet containing the audio data and separated from the packet sequence by the TS processor 122 is input to an audio decoder 123, where decoding corresponding to the encoding scheme is performed. The audio data decoded by the audio decoder 123 is sent to an audio data processor 124, where synchronization processing, volume adjustment, etc., are performed. The resultant data is supplied to an audio output unit 125. The audio output unit 125 performs, for example, stereo separation processing corresponding to the speaker system employed, and supplies an output to a loudspeaker 126.
The video packet containing video data and separated from a packet sequence by the TS processor 122 is input to a video decoder 131, where decoding corresponding to the encoding scheme is performed. The video data decoded by the video decoder 131 is sent to a video data processor 132, where synchronization processing, luminance adjustment, color adjustment, etc., are performed. The video data processor 132 functions as a video reproduction apparatus according to the first embodiment, and comprises a color gamut correcting unit 1 that determines the color gamut of video data, and corrects the color gamut based on the result of the determination. The output of the video data processor 132 is sent to a video output unit 133.
For instance, the video output unit 133 can superimpose, upon a main video signal, data, figures and a program table sent from the controller 200. Further, the video output unit 133 sets, for the output video signal, the scale, resolution, number of lines, aspect ratio, etc. corresponding to the display 134, and outputs the signal to the display 134. The display 134 is a video display apparatus compliant with a wide color gamut.
There is a case where an audio packet and a video packet for a pay program are encrypted. In this case, a processing system for decrypting the encryption using key information is also used. However, description of this system is omitted.
The controller 200 comprises a central processing unit (CPU) 201, a command processor 202, a communication controller 203, a device managing unit 204, a display controller 211, an on-screen display (OSD) block 212, a memory 213, etc.
The controller 200 also comprises an EPG data processor for generating a program table signal using EPG data, and a recording/reproducing controller (not shown in
The CPU 201 performs adjustment of the entire operation sequence of the controller 200. The command processor 202 can analyze externally input operation commands and reflect operations corresponding to the commands in the television receiver 100. The device managing unit 204 stores device identification data associated with a mobile terminal 500, a remote controller 115, etc., that supply operation signals to the controller 200.
The display controller 211 can supply a program table signal or a menu video signal to the video output unit 133 via the OSD block 212. The display controller 211 can also perform adjustment processing associated with the resolution of an image signal, a display size, a display area, etc.
The memory 213 can store various types of data and applications, etc., to be stored within the controller 200.
The communication controller 203 can communicate with external devices to obtain operation commands, data, content, etc. The obtained content and data can be stored in, for example, the storage unit 111 or the memory 213. The communication controller 203 can transmit data, content, etc., from the electronic device 100 to external devices. For instance, the communication controller 203 can transmit data on a program list generated by a processor 330 to an external mobile terminal 500, such as a smartphone or a tablet.
The communication controller 203 is connected to a receiver (a short-range communication unit 112 and a long-range communication unit 113). The short-range communication unit 112 can transmit and receive data to and from the mobile terminal 500, and is used for short-range communication. By inputting an instruction to the instruction input unit of the mobile terminal 500, the operation of the system 100 can be controlled. The mobile terminal 500 can receive a program list generated by the processor 330, as well as video and audio data, and can display them.
The long-range communication unit 113 can transmit and receive data via the Internet to and from a remote server, a home server or a cloud server. The long-range communication unit 113 communicates with, for example, the remote server via radio or via a fixed line (an optical cable, a local area network). The remote server has a receiver for receiving a command signal from the remote communication unit 113.
The system 100 can also receive an operation signal from the remote controller 115 via a receiver (remote controller communication unit 114). The remote controller 115 has an instruction input unit, like the mobile terminal 500.
The mobile terminal 500 can access a server via a base station (not shown), the Internet, etc. It can download various types of applications, game software, etc., from the server, as well as content served by the server, and transfer them to the controller 200 via the short-range communication unit 112.
The mobile terminal 500 can also transfer information (such as a web server address, a mail address and a network address), used to obtain content and various types of served information, to the controller 200 via the short-range communication unit 112. The mobile terminal 500 may transfer, for example, the web server address, the mail address and the network address to the controller 200 via a base station or a network Netw.
Utilizing the above-mentioned web server, mail address, etc., the communication controller 203 can obtain information associated with, for example, a program.
When content, an application or game software is transferred from the mobile terminal 500, the communication controller 203 included in the controller 200 operates.
The communication controller 203 stores the received content in the memory 213. The content may be stored in the storage unit 111 in accordance with an operation command or automatically. In the storage unit 111, the received content is recorded in, for example, a hard disk. In the hard disk, the content is managed as a content file.
A display menu video signal, a program table signal, etc., are controlled by the display controller 211. When a menu or a program table is displayed, menu screen data or the program table signal is read from the OSD block 212 and sent to the video output unit 133 under the control of the display controller 211. As a result, the menu or the program table is displayed on the display 134. The menu screen data or program table signal may be read from a data storage unit (memory or hard disk) under the control of the display controller 211.
The display menu video signal, the program table signal, etc., can also be transmitted to the mobile terminal 500. When the mobile terminal 500 has requested the menu video signal, the program table signal, etc., the display controller 211 can transmit the menu video signal, the program table signal, etc., to the mobile terminal 500.
The mobile terminal 500 can display the menu video signal and the program table video signal on a touch panel screen. By touching a button displayed on the touch (pointer) panel screen, a user can supply an operation instructing signal to the electronic device.
The controller 200 further comprises a processor 330 (which has a function of generating a program list, and a storage control function of storing, in the memory 213, program information associated with preference information).
The processor 330 stores, in the memory 213, program information extracted from program table information and indicative of a plurality of programs associated with a predetermined preference. For instance, the processor 330 rearranges programs to be broadcasted in a decreasing order of user preference degree, based on EPG data and predetermined preference information, and stores information indicative of the order of program preference in the memory 213. The predetermined preference information is information associated with, for example, a user's program viewing history, an external information search history, a shopping history, a history of communication using, for example, email, and text or images uploaded to the Internet by the user. The EPG data may be extracted from a TS obtained by a tuner, or be obtained from an external server via a network. The information stored in the memory 213 may be updated, for example, when a broadcast program is being viewed, or whenever a recorded program has been replayed. Further, the information may be updated regularly and automatically, or manually in accordance with a user's instruction.
Referring now to
The chromaticity diagram coordinate calculator 2 calculates two-dimensional coordinates on a chromaticity diagram corresponding to an input video signal. The deviation determining unit 3 determines whether the two-dimensional coordinates calculated by the chromaticity diagram coordinate calculator 2 fall within or outside a preset BT.709 color gamut on a color gamut diagram. The color gamut determining unit 4 determines whether the color gamut corresponding to the input video signal is a predetermined color gamut, based on the determination result of the deviation determining unit 3. The color gamut increasing unit 5 adjusts a color gamut increasing method to perform color gamut increase processing on the input video signal, based on the determination result of the color gamut determining unit 4.
Referring then to
In
Np = w × h (1)
Further, each pixel has three components (pixel values), expressed either as Y, Cb and Cr, or as R, G and B. Each component is expressed as a digital signal with an accuracy of about 8 to 16 bits.
After determination as to the input of a video signal in step ST1, the program proceeds to step ST2. In step ST2, the chromaticity diagram coordinate calculator 2 calculates the coordinates, on the chromaticity diagram, of each pixel of the input video signal, the coordinates corresponding to the pixel values. Chromaticity diagrams include the CIE xy chromaticity diagram, the UCS chromaticity diagram, etc. In the embodiment, a chromaticity diagram, in which the color gamut is expressed as the internal area of a triangle defined using the three primary colors (RGB) as vertexes, is used for calculating the coordinates.
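As an illustrative sketch of this step (not the embodiment's exact implementation), the following converts one normalized Y/Cb/Cr pixel value to R/G/B and then to CIE xy coordinates, using the standard BT.709 matrix coefficients and assuming linear signals (the transfer function is ignored for simplicity):

```python
def ycbcr_to_rgb(y, cb, cr):
    # BT.709 Y'CbCr -> R'G'B', with Y' in [0, 1] and Cb/Cr in [-0.5, 0.5]
    r = y + 1.5748 * cr
    g = y - 0.1873 * cb - 0.4681 * cr
    b = y + 1.8556 * cb
    return r, g, b

def rgb_to_xy(r, g, b):
    # Linear BT.709/sRGB RGB -> CIE XYZ (D65 white point), then -> (x, y)
    X = 0.4124 * r + 0.3576 * g + 0.1805 * b
    Y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    Z = 0.0193 * r + 0.1192 * g + 0.9505 * b
    s = X + Y + Z
    if s == 0:
        return (0.3127, 0.3290)  # map black to the D65 white point
    return (X / s, Y / s)
```

For example, a pure red pixel (R, G, B) = (1, 0, 0) maps to approximately (0.640, 0.330), the BT.709 red primary on the xy diagram.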
The term “color gamut” means a color range that can be expressed, and the color gamut differs among different input/output devices and different standards of video signals. As a typical standard color gamut, there is a so-called narrow gamut of BT.709 (more specifically, ITU-R BT.709-3). The triangular range A indicated by the broken line in
A video signal obtained by a camera having a color gamut equal in scale to or narrower than the BT.709 gamut is directly transmitted and recorded as a BT.709 signal. However, the natural world also contains colors outside the BT.709 gamut, and if these colors are photographed, they are recorded not as achromatic colors but as certain colors; that is, the color gamut is compressed when recorded. Further, in a video image obtained by a camera having a wider color gamut than BT.709, the color gamut is compressed and recorded by a method unique to the camera system itself.
C in
In addition, when a video image obtained by the wide color gamut camera is transferred as a video signal corresponding to a video signal standard (e.g., BT.2020) having a wider color gamut than BT.709, it is not compressed but is directly transferred and recorded. Similarly, a video image recorded based on BT.709 may have its color gamut increased, whereby it is converted into a video signal having a wide color gamut (i.e., into a color-gamut-increased signal), and is transferred and recorded.
After chromaticity diagram coordinate calculation is finished in step ST2, the program proceeds to step ST3. In step ST3, the deviation determining unit 3 determines whether the input video signal falls within or outside a narrow gamut, such as BT.709.
Whether a color falls within or outside the BT.709 color gamut can be determined from inequalities that use the coordinates (xi, yi) of each pixel value on the chromaticity diagram and equations corresponding to the three straight lines (RG, GB, BR). Assuming that the coordinates of R, G and B are (xR, yR), (xG, yG) and (xB, yB), the straight line passing through points R and G is expressed by coordinates (x, y) that satisfy the following equations (2) to (4). Accordingly, colors (xi, yi) outside the color gamut satisfy the following inequality (5), and colors (xi, yi) within the color gamut satisfy the following inequality (6).
where the inequality (5) is a sufficient condition for the colors (xi, yi) to fall outside the color gamut, and the inequality (6) is a necessary condition for the colors (xi, yi) to fall within the color gamut. Note that the inequalities (5) and (6) correspond to the case where the region of lower y values with respect to the straight line falls within the color gamut. Conversely, if the interior of the triangle defined by the straight lines lies at higher y values, the inequality signs are inverted.
Similarly, with respect to the straight line passing through points G and B, the following arithmetic expressions (7) to (11) are used to discriminate the inside and outside of the color gamut. With respect to the straight line passing through points B and R, the following arithmetic expressions (12) to (16) are used to discriminate the inside and outside of the color gamut. In conclusion, if any one of the inequalities (5), (10) and (15) is satisfied, the point (xi, yi) falls outside the color gamut, and if not, the point (xi, yi) falls within the same. In other words, if all of the inequalities (6), (11) and (16) are satisfied, the point (xi, yi) falls within the color gamut, and if not, the point (xi, yi) falls outside the same.
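The three line inequalities together form a standard point-in-triangle test. A minimal sketch, using the published BT.709 primary chromaticities as vertexes (a sign convention is chosen so that mixed signs mean "outside"):

```python
# BT.709 primaries on the CIE xy chromaticity diagram
R709 = (0.640, 0.330)
G709 = (0.300, 0.600)
B709 = (0.150, 0.060)

def side(p, a, b):
    # Signed area test: which side of the line a->b the point p lies on
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

def inside_gamut(p, r=R709, g=G709, b=B709):
    # (xi, yi) is inside the triangle iff it lies on the same side of all
    # three edges RG, GB and BR (the analogue of inequalities (6), (11), (16));
    # a mixed sign corresponds to one of (5), (10), (15) being satisfied.
    d1, d2, d3 = side(p, r, g), side(p, g, b), side(p, b, r)
    has_neg = d1 < 0 or d2 < 0 or d3 < 0
    has_pos = d1 > 0 or d2 > 0 or d3 > 0
    return not (has_neg and has_pos)
```

For instance, the D65 white point (0.3127, 0.3290) falls inside the BT.709 triangle, while the BT.2020 red primary (0.708, 0.292) falls outside it.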
As described above, in step ST3, it is determined whether the point (xi, yi) obtained by mapping each pixel value of the video signal on the chromaticity diagram falls within or outside the color gamut, and information indicative of the determination result is transmitted to the color gamut determining unit 4, followed by the program proceeding to step ST4.
In step ST4, the color gamut corresponding to the input video signal is determined by the color gamut determining unit 4, using the determination result information received from the deviation determining unit 3. Assuming that the number of pixel values determined to fall within the narrow color gamut is counted and set as nin, and the number of pixel values determined to fall outside the narrow color gamut is counted and set as nout, if nout is greater than θa (see
After determining the color gamut in step ST4, the program proceeds to step ST5. In step ST5, the color gamut increasing unit 5 receives the result of color gamut determination from the color gamut determining unit 4. If the result indicates a narrow color gamut (Yes in step ST5), the program proceeds to step ST6, while if it indicates a wide color gamut (No in step ST5), the program proceeds to step ST7.
In step ST6, the color gamut increasing unit 5 performs gamut increase processing on each narrow-gamut (e.g., BT.709 gamut) pixel included in the video signal to thereby create a wide-gamut (e.g., BT.2020 gamut) pixel, followed by the program proceeding to step ST7.
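The embodiments do not fix a particular expansion algorithm here; as a purely hypothetical sketch, one simple form of gamut increase leaves luma untouched and scales the chroma components away from the neutral axis (the gain of 1.2 is an assumed parameter, not from the source):

```python
def expand_gamut(y, cb, cr, gain=1.2):
    # Hypothetical chroma-gain expansion: luma Y' is kept as-is, while
    # Cb/Cr (normalized to [-0.5, 0.5]) are scaled away from neutral
    # and clipped back to the legal range.
    clip = lambda v: max(-0.5, min(0.5, v))
    return y, clip(cb * gain), clip(cr * gain)
```

Applied per pixel in step ST6, this makes narrow-gamut colors occupy more of the wide-gamut container; a real implementation would instead use a carefully tuned mapping to avoid hue shifts and clipping artifacts.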
In step ST7, the color gamut increasing unit 5 outputs a pixel-processed video signal, followed by the program proceeding to step ST8.
In step ST8, it is determined whether input of the video signal has finished. If the input has finished (Yes in step ST8), the processing is finished, whereas if it has not yet finished (No in step ST8), the program returns to step ST1 to thereby iterate the above-mentioned steps ST2 to ST7.
By virtue of the above processing, in the first embodiment, even when the attribute information of a video signal includes no original color gamut information, the original color gamut is determined from each pixel value of the input video signal, and color gamut increase processing is performed adaptively based on the determination result. As a result, display can be performed with preferable colors faithful to materials.
Referring now to
The color histogram counter 6 receives a video signal having its chromaticity diagram coordinates calculated by the chromaticity diagram coordinate calculator 2. The color histogram counter 6 determines within which one of the bins corresponding to predetermined color ranges the color indicated by each pixel value of the input video signal falls, increments the count of the determined bin, and outputs the counts of the respective bins as a color histogram to the color gamut determining unit 4.
Referring then to
The procedure shown in
More specifically, in step ST19, the color histogram counter 6 determines within which one of the bins corresponding to predetermined color ranges the color indicated by each pixel value of the video signal falls, and increments the count of the determined bin. In
In step ST4, the color gamut determining unit 4 determines the color gamut corresponding to the input video signal, using count result information received from the color histogram counter 6. In the second embodiment, assuming that Bi is the ith bin, Cin is the set of bins within the color gamut, and Cout is the set of bins outside the color gamut, the sum of the counts ni of the bins belonging to Cin within the narrow color gamut is computed and set as nin (see the following equation (17)), and the sum of the counts ni of the bins belonging to Cout outside the narrow color gamut is computed and set as nout (see the following equation (18)).
Alternatively, in order to smoothly shift the determination result in the vicinity of the boundary of the color gamut, the ratio of bin Bi falling within the gamut may be set as wiin, and the ratio falling outside the gamut set as wiout, thereby calculating nin and nout using the following equations (19) to (21):
If nout is greater than θa (see
By virtue of the above processing procedure, also in the second embodiment, even when the attribute information of a video signal includes no original color gamut information, the original color gamut is determined from each pixel value of the input video signal, and color gamut increase processing is performed adaptively based on the determination result. As a result, display can be performed with preferable colors faithful to materials.
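The bin counting of step ST19 and the determination of step ST4 can be sketched as follows. The decision rule comparing the out-of-gamut share against θa is an assumption (the exact comparison is truncated in the text above), and the bin and weight structures are illustrative:

```python
def build_histogram(pixel_bins):
    # ST19: count how many pixels fall into each predetermined color bin
    hist = {}
    for b in pixel_bins:
        hist[b] = hist.get(b, 0) + 1
    return hist

def classify_gamut(hist, inside_weight, theta_a=0.01):
    # ST4 with the soft weights of eqs. (19)-(21): inside_weight[b] is the
    # fraction of bin b that lies inside the narrow (BT.709) gamut.
    n_in = sum(c * inside_weight.get(b, 0.0) for b, c in hist.items())
    n_out = sum(c * (1.0 - inside_weight.get(b, 0.0)) for b, c in hist.items())
    total = n_in + n_out
    # Assumed rule: a large out-of-gamut share means the source is wide gamut
    return "wide" if total > 0 and n_out / total > theta_a else "narrow"
```

With hard weights (0 or 1) this reduces to the sums of equations (17) and (18); fractional weights smooth the decision near the gamut boundary.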
In
A third embodiment is obtained by adding, to the first embodiment, processing for detecting a scene change in the input video signal and changing the color gamut increase processing at an appropriate timing based on the detection result. In this embodiment, no detailed description will be given of the elements that perform the same processing as in the first embodiment.
The video color gamut correcting unit 1 of
In
In step ST11, the color gamut determining unit 4 sets “enlargement” as a recommended mode if the color gamut determination result indicates a narrow color gamut, and sets “No conversion” as the recommended mode if the color gamut determination result indicates a wide color gamut. The set recommended mode is reported to the enlargement mode recorder 7. After that, the program proceeds to step ST12.
In step ST12, the scene change detector 8 detects whether a scene change associated with the input video signal has occurred, and reports the determination result to the color gamut increasing unit 5. If a scene change has been detected, the program proceeds to step ST13, while if no scene change has been detected, the program proceeds to step ST15.
The determination as to the above-mentioned scene change is performed in the following way: a statistic, such as the average luminance value or the luminance variance, is calculated for each image frame, and the distance between the statistics of two successive image frames is calculated. If the distance exceeds a preset threshold, it is determined that a scene change has occurred. For instance, if the following inequality is satisfied, it is determined that a scene change has occurred.
(Y1 − Y2)² + (s1 − s2)² > θ
where Y1 is the average luminance value of a first image frame, s1 is the variance associated with the first image frame, Y2 is the average luminance value of a second image frame, s2 is the variance associated with the second image frame, and θ is a threshold.
In another method, the pixel values at preset particular coordinates can be directly used as statistics. Further, in the case of a color image, the pixel values (luminance Y and color differences U, V) can be used as three-dimensional values. Yet further, a color histogram may be calculated from the pixel values, and a scene change detected from differences between the histograms of successive frames.
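A minimal sketch of the statistics-based detection described above, using per-frame mean luminance and variance (the frame data and threshold value are illustrative):

```python
def frame_stats(frame):
    # Per-frame statistics: mean luminance and luminance variance
    n = len(frame)
    mean = sum(frame) / n
    var = sum((p - mean) ** 2 for p in frame) / n
    return mean, var

def is_scene_change(prev_frame, frame, theta=100.0):
    # Squared distance between the statistics of consecutive frames,
    # matching (Y1 - Y2)^2 + (s1 - s2)^2 > theta
    y1, s1 = frame_stats(prev_frame)
    y2, s2 = frame_stats(frame)
    return (y1 - y2) ** 2 + (s1 - s2) ** 2 > theta
```

Identical frames yield a zero distance and no detection, while a large jump in mean luminance between frames exceeds the threshold and signals a scene change.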
In step ST13, the color gamut increasing unit 5 compares the conversion mode with the recommended mode stored in the enlargement mode recorder 7. If they differ from each other, the recommended mode is substituted for the conversion mode (step ST14), and the program proceeds to step ST15. In contrast, if the conversion mode (value) is equal to the recommended mode (value), the program directly proceeds to step ST15.
In step ST15, the color gamut increasing unit 5 checks the content of the conversion mode. If it is “enlargement,” the program proceeds to step ST6, while if it is “No conversion,” the program proceeds to step ST7.
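Steps ST11 to ST15 amount to a small state machine that defers mode switches until a scene change. A sketch (the class and mode names are illustrative, not from the embodiment):

```python
class GamutModeController:
    """Defers switching the conversion mode until a scene change,
    in the manner of steps ST11-ST15."""

    def __init__(self):
        self.conversion_mode = "no_conversion"
        self.recommended_mode = "no_conversion"

    def update(self, narrow_gamut_detected, scene_changed):
        # ST11: record the recommended mode from the gamut determination
        self.recommended_mode = "enlarge" if narrow_gamut_detected else "no_conversion"
        # ST12-ST14: adopt the recommended mode only at a scene change
        if scene_changed and self.conversion_mode != self.recommended_mode:
            self.conversion_mode = self.recommended_mode
        # ST15: the caller applies expansion only when mode is "enlarge"
        return self.conversion_mode
```

Because the active conversion mode changes only at scene boundaries, a mid-scene flip of the gamut determination does not cause a visible mid-scene change in color processing.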
In step ST6, the color gamut increasing unit 5 performs color gamut increase processing on the input video signal, followed by the program proceeding to step ST7.
In the third embodiment, since a scene change in an input video signal is detected, and the timing of change in color gamut processing is controlled based on the detection result, a sense of visual discomfort due to the change of color gamut processing can be reduced.
As described above, in the color gamut increase processing performed in the embodiments, appropriate color gamut increase processing is adaptively performed in real time on both a video image created in a narrow color gamut and a video image created in a wide color gamut, thereby outputting video images faithful to materials, exhibiting clean colors, and imparting a natural impression.
The above-described embodiments can also be implemented as video display apparatuses including a wide-color-gamut compliant display.
While certain embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the inventions. Indeed, the novel embodiments described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions and changes in the form of the embodiments described herein may be made without departing from the spirit of the inventions. The accompanying claims and their equivalents are intended to cover such forms or modifications as would fall within the scope and spirit of the inventions.
This application claims the benefit of U.S. Provisional Application No. 62/043,949, filed Aug. 29, 2014, the entire contents of which are incorporated herein by reference.