This application relates to and claims priority from Japanese Patent Application No. 2010-176925 filed on Aug. 6, 2010, the entire disclosure of which is incorporated herein by reference.
The present invention relates to a broadcast receiving apparatus for three-dimensional (hereinafter, 3D) video, as well as to a receiving method and a transmitting method.
Patent Document 1 describes, as the problem to be solved, “displaying a suitable 2D picture or 3D picture, taking eye fatigue into consideration,” and, as the solving means, the following: “the picture generated by an output T2 of a 2D/3D converting means becomes the picture L31 shown in 1-3. The picture L31 is a 3D picture in which pictures for the left eye and the right eye, produced on the basis of the T1 signal, are aligned. When 2D is selected through 2D/3D manual switching, the picture of 2-6 is supplied to a stereoscopic picture display means, and the picture L51-2 of 2-9 is displayed. When 3D is selected through the 2D/3D manual switching, the picture of 2-7 is supplied to the stereoscopic picture display means, and the picture L51-3 of 2-10 is displayed. Since 2-7 is a 3D picture, and a 2D/3D switching control means controls the stereoscopic picture display means so as to obtain a 3D picture display, a 3D picture is displayed. As mentioned above, control is performed in such a manner that a T2 signal not suitable for display is prevented from being selected,” and so on.
However, Patent Document 1 contains no disclosure regarding the processing at the point in time when the video signal is switched, such as at a change of program; there is therefore a problem that, depending on the situation, the picture cannot be displayed appropriately.
To solve the above problem, according to one embodiment of the present invention, it is sufficient to apply the technical idea described in the claims, for example.
By such means, it is possible to output the most suitable picture to a user when switching occurs among various video signals, such as 2D and 3D signals, and as a result, usability for the user can be improved.
Those and other objects, features and advantages of the present invention will become more readily apparent from the following detailed description when taken in conjunction with the accompanying drawings wherein:
Hereinafter, embodiments of the present invention will be explained in detail with reference to the attached drawings. However, the present invention is not limited to these embodiments. In the embodiments, the explanation is given mainly with respect to a receiving apparatus, and the embodiments are therefore suitable for implementation in a receiving apparatus; however, this does not preclude application to apparatuses other than a receiving apparatus. Also, not all of the constituent elements of the embodiments need be adopted; they can be applied selectively.
Reference numeral 1 denotes a transmitting apparatus installed in an information providing station such as a broadcast station; 2, a relay apparatus provided in a relay station, a broadcast satellite, or the like; 3, a public network, such as the Internet, connecting an ordinary home and the broadcast station; 4, a receiving apparatus provided within a user's house; and 10, a receiving/recording/reproducing unit built into the receiving apparatus 4. Within the receiving/recording/reproducing unit 10, broadcast information can be recorded and reproduced, and content from a removable external medium can be reproduced, for example.
The transmitting apparatus 1 transmits modulated signal waves through the relay apparatus 2. Besides transmission through a satellite as shown in the figure, transmission by cable, by telephone line, by terrestrial broadcast wave, or through a network such as the Internet via the public network 3 may also be applied. The signal wave received by the receiving apparatus 4 is, as will be mentioned later, demodulated into an information signal and then recorded on a recording medium as necessary. When the signal is transmitted through the public network 3, it is converted into a format such as a data format (IP packets) conforming to a protocol suitable for the public network 3 (for example, TCP/IP); the receiving apparatus 4 receives the data, decodes it into the information signal, and records it on the recording medium, as a signal suitable for recording, as necessary. The user can view and listen to the video and audio represented by the information signals on a display, either one built into the receiving apparatus 4 or, when no display is built in, one (not shown in the figure) connected to the receiving apparatus 4.
Reference numeral 11 denotes a source generator unit; 12, an encoder unit that performs compression according to MPEG-2, H.264, or the like and adds program information; 13, a scramble unit; 14, a modulator unit; 15, a transmission antenna; and 16, a management information attachment unit. The information generated by the source generator unit 11, which is composed of a camera, a recording/reproducing apparatus, and the like, is compressed in data amount by the encoder unit 12 so that it can be transmitted occupying less bandwidth. After being modulated in the modulator unit 14 into a signal suitable for transmission, such as OFDM, TC8PSK, QPSK, or multi-level QAM, it is transmitted as a radio wave toward the relay apparatus 2 from the transmission antenna 15. At this time, the management information attachment unit 16 attaches program identifying information, such as attributes of the content produced in the source generator unit 11 (for example, coding information of video and/or audio, the structure of the program, whether it is a 3D picture, etc.), and also attaches program arrangement information produced by the broadcast station or the like (for example, the structure of the present and next programs, the form of service, structure information of the programs for one week, etc.). The program identifying information and the program arrangement information are hereinafter collectively called “program information.”
However, in many cases, plural pieces of information are multiplexed using a scheme such as time division or spread spectrum. Though not shown in
Also, as with the signal to be transmitted through the public network 3, the signal produced in the encoder unit 12 is encoded in the encoder unit 17 so that it can be viewed and listened to only by specific viewers. After being coded, in a communication path coding unit 18, into a signal suitable for transmission through the public network 3, it is transmitted toward the public network 3 from a network I/F (Interface) unit 19.
As the method of transmitting a 3D program from the transmitting apparatus 1, there are, roughly divided, two methods. One is a method of accommodating the pictures for the left eye and the right eye within one screen, utilizing the existing broadcast method for 2D programs. This method uses the existing MPEG 2 (Moving Picture Experts Group 2) or H.264 AVC as the video coding method, and its feature is that it maintains compatibility with the existing broadcast; the existing relay infrastructure can therefore be used, and reception by existing receivers (STBs, etc.) is possible. However, the transmitted 3D picture has half the maximum resolution (in the vertical or the horizontal direction) of the existing broadcast. For example, as is shown in
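The frame-compatible arrangement described above can be sketched as follows. This is a minimal illustration, assuming the decoded frame is represented as a two-dimensional list of pixels; the representation and function names are assumptions for illustration, not part of any broadcast standard.

```python
# Hypothetical sketch: recovering the left-eye and right-eye pictures from a
# frame-compatible 3D frame. Each recovered picture has half the resolution
# of the broadcast frame, in the horizontal or the vertical direction.

def split_side_by_side(frame):
    """In the "Side-by-Side" method, the left half of each row carries the
    left-eye picture and the right half the right-eye picture."""
    half = len(frame[0]) // 2
    left_eye = [row[:half] for row in frame]
    right_eye = [row[half:] for row in frame]
    return left_eye, right_eye

def split_top_and_bottom(frame):
    """In the "Top-and-Bottom" method, the vertical resolution is halved
    instead: the upper half of the frame carries the left-eye picture."""
    half = len(frame) // 2
    return frame[:half], frame[half:]
```

A receiver without 3D capability would simply display the whole frame, which is why this method is compatible with the existing 2D infrastructure but halves the per-eye resolution.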
As the other method, there is a method of transmitting the picture for the left eye and the picture for the right eye on separate elementary streams (ES). Hereinafter in the present embodiment, this method is called the “3D 2-viewpoint separate ES transmission method.” One example of this method is transmission by means of H.264 MVC, a multi-view video coding method. Its feature is that a 3D picture of high resolution can be transmitted; using this method gives the effect that high-resolution 3D pictures can be transmitted. The multi-view video coding method is a coding method standardized for encoding multi-view pictures; it can encode multi-view pictures without dividing one screen among the viewpoints, instead encoding a separate picture for each viewpoint.
When transmitting a 3D picture with this method, it is sufficient, for example, to transmit the encoded picture of the left-eye viewpoint as the main-view picture and the encoded picture of the right-eye viewpoint as the other-view picture. By doing so, compatibility with the existing 2D program broadcasting method can be maintained for the main-view picture. For example, when H.264 MVC is applied as the multi-view video coding method, the main-view picture, carried in the base substream of H.264 MVC, maintains compatibility with 2D pictures of H.264 AVC, and the main-view picture can thereby be displayed as a 2D picture.
Further, in the embodiment of the present invention, the following methods are also included as other forms of the “3D 2-viewpoint separate ES transmission method.”
As one example of the “3D 2-viewpoint separate ES transmission method,” there is a method of encoding the picture for the left eye, as the main-view picture, by MPEG-2, and encoding the picture for the right eye, as the other-view picture, by H.264 AVC, thereby obtaining separate streams. With this method, since the main-view picture is MPEG-2 compatible and can be displayed as a 2D picture, compatibility can be maintained with the existing 2D program broadcasting method, in which MPEG-2-encoded pictures are in widespread use.
As another example of the “3D 2-viewpoint separate ES transmission method,” there is a method of encoding the picture for the left eye, as the main-view picture, by MPEG-2, and encoding the picture for the right eye, as the other-view picture, also by MPEG-2, thereby obtaining separate streams. With this method as well, since the main-view picture is MPEG-2 compatible and can be displayed as a 2D picture, compatibility can be maintained with the existing 2D program broadcasting method, in which MPEG-2-encoded pictures are in widespread use.
As a further example of the “3D 2-viewpoint separate ES transmission method,” the picture for the left eye may be encoded, as the main-view picture, by H.264 AVC or H.264 MVC, while the picture for the right eye is encoded, as the other-view picture, by MPEG-2.
Further, apart from the “3D 2-viewpoint separate ES transmission method,” even with a coding method such as MPEG-2 or H.264 AVC (excluding MVC), which is not originally specified as a multi-view video coding method, it is also possible to produce a stream storing frames for the left eye and frames for the right eye alternately, thereby enabling 3D transmission.
The program identifying information and the program arrangement information are collectively called program information.
The program identifying information is also called PSI (Program Specific Information) and is the information necessary for selecting a desired program. It consists of four tables: PAT (Program Association Table), which designates the packet identifier of the TS packets transmitting the PMT (Program Map Table) associated with each broadcast program; PMT, which designates the packet identifiers of the TS packets transmitting the coded signals constituting a broadcast program; NIT (Network Information Table), which transmits information relating to the transmission path, such as the modulation frequency, together with information relating to broadcast programs; and CAT (Conditional Access Table), which designates the packet identifier of the TS packets transmitting the individual information for pay broadcasts. These tables are specified in the MPEG-2 systems standard. The PSI includes, for example, the coding information of video, the coding information of audio, and the structure of the program. In the present invention, information indicating whether the program is a 3D picture is newly added. The PSI is attached within the management information attachment unit 16.
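As a minimal sketch of how a receiver uses the PAT described above, the following assumes the table has already been demultiplexed and parsed into a simple dictionary; the dictionary layout and function name are illustrative assumptions, since an actual receiver extracts these fields from TS packet sections.

```python
# Hypothetical sketch: walking a parsed PAT to find the packet identifier
# (PID) of the TS packets that carry a desired program's PMT.

def find_pmt_pid(pat, program_number):
    """The PAT maps each broadcast program_number to the PID of the TS
    packets transmitting that program's PMT; returns None if absent."""
    for entry in pat["programs"]:
        if entry["program_number"] == program_number:
            return entry["pmt_pid"]
    return None

# Example PAT contents (program numbers and PIDs are made-up values).
pat = {"programs": [{"program_number": 101, "pmt_pid": 0x1F0},
                    {"program_number": 102, "pmt_pid": 0x1F1}]}
```

Having obtained the PMT PID this way, the receiver then filters those TS packets to learn the packet identifiers of the video and audio ESs constituting the program.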
The program arrangement information is also called SI (Service Information) and includes various kinds of information defined for convenience of program selection; it includes the PSI information of the MPEG-2 systems standard and, among others, the following: EIT (Event Information Table), describing information relating to programs, such as the program title, broadcast date/time, and broadcast content; and SDT (Service Description Table), describing information relating to the organization of channels (services), such as the channel name and the name of the broadcaster.
For example, it includes the organization of the program currently broadcast and of the program to be broadcast next, the form of service, and information indicating the program organization for one week, and it is attached within the management information attachment unit 16.
The broadcast information includes, as constituent elements, a component descriptor, a component group descriptor, a 3D program detail descriptor, a service descriptor, a service list descriptor, and so on. These descriptors are described in tables such as PMT, EIT [schedule basic/schedule extended/present/following], NIT, and SDT, for example.
Regarding the usage of the respective tables, taking PMT and EIT as examples: PMT describes only the information of the program currently broadcast, so information on programs to be broadcast in the future cannot be confirmed from it. However, since the transmission cycle from the transmitter side is short, the time until reception is completed is also short; and since it carries the information of the program currently broadcast, that information is not subject to change. In that sense, PMT has the feature of high reliability. On the other hand, with EIT [schedule basic/schedule extended], information on programs up to seven days ahead can be obtained in addition to the program currently broadcast; however, since the transmission cycle from the transmitter side is longer than that of PMT, the time until reception is completed is longer, a large memory area is required to hold the information, and the information describes future events and may therefore be changed. In these respects, it has the demerit of lower reliability. From EIT [following], information on the program of the next broadcast time can be obtained.
The PMT of the program identifying information, using the table structure defined in ISO/IEC 13818-1, can indicate the format of the ESs of the program being broadcast by “stream_type” (stream format type), 8-bit information described in its second loop (the loop for each ES (Elementary Stream)). In an embodiment of the present invention, the number of ES formats is increased over the conventional one, and the ES formats of the programs to be broadcast are assigned as is shown in
First, to the base-view bitstream (main view) of a multi-view video coded (example: H.264/MVC) stream, “0x1B” is assigned, the same value as for the AVC video stream defined by the existing ITU-T Recommendation H.264|ISO/IEC 14496-10 video. Next, the sub-bitstream (other view) of the multi-view video coded stream (for example, H.264 MVC), which can be used for a 3D picture program, is assigned “0x20.”
Also, to the base-view bitstream (main view) of the H.262 (MPEG-2) method, when used in the “3D 2-viewpoint separate ES transmission method” for transmitting the multiple views of a 3D picture as separate streams, “0x02” is assigned, the same value as for the existing ITU-T Recommendation H.262|ISO/IEC 13818-2 video. Here, the base-view bitstream (main view) of the H.262 (MPEG-2) method in the case of transmitting the multiple views of a 3D picture as separate streams is a stream in which, among the multi-view pictures of the 3D picture, only the main-view picture is encoded by the H.262 (MPEG-2) method.
Further, “0x21” is assigned to the other-view bitstream of the H.262 (MPEG-2) method in the case of transmitting the multiple views of a 3D picture as separate streams.
Further, “0x22” is assigned to the other-view bitstream of the AVC stream format defined by ITU-T Recommendation H.264|ISO/IEC 14496-10 video in the case of transmitting the multiple views of a 3D picture as separate streams.
In the explanation given here, the sub-bitstream of the multi-view video coded stream usable for a 3D picture program is assigned “0x20,” the other-view bitstream of the H.262 (MPEG-2) method in the case of transmitting the multiple views of a 3D picture as separate streams is assigned “0x21,” and the other-view AVC stream defined by ITU-T Recommendation H.264|ISO/IEC 14496-10 video in that case is assigned “0x22”; however, any of “0x23” through “0x7E” may be assigned instead. Also, the MVC video stream is only one example; a video stream other than H.264/MVC may be used, as long as it is a multi-view coded video stream usable for a 3D picture program.
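The assignments above can be summarized in a small sketch that checks whether the ES loop of a PMT indicates a program transmitted by the “3D 2-viewpoint separate ES transmission method.” The two set names and the function are illustrative assumptions, not part of the standard.

```python
# "stream_type" values as assigned above; this grouping into main-view and
# other-view sets is an illustrative convenience.
MAIN_VIEW_TYPES = {0x1B, 0x02}         # main-view ES, decodable by existing 2D receivers
OTHER_VIEW_TYPES = {0x20, 0x21, 0x22}  # other-view ES, ignored by existing 2D receivers

def is_two_view_3d(stream_types):
    """stream_types: the stream_type values found in the PMT's second loop.
    A program carrying both a main-view ES and an other-view ES can be
    reproduced as 3D; an existing receiver decodes only the main view as 2D."""
    has_main = any(t in MAIN_VIEW_TYPES for t in stream_types)
    has_other = any(t in OTHER_VIEW_TYPES for t in stream_types)
    return has_main and has_other
```

For instance, a PMT listing stream types 0x1B and 0x20 (as in combination example 1) or 0x02 and 0x22 (as in combination example 2) satisfies this check.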
As mentioned above, by assigning the bits of “stream_type” (stream format type) according to the embodiment of the present invention, it is possible to transmit a 3D program with the combinations of streams shown in
In combination example 1, as the main-view (left-eye) video stream, the base-view sub-bitstream (main view) (stream format type “0x1B”) of the multi-view video coded (example: H.264/MVC) stream is transmitted, and as the sub-view (right-eye) video stream, the sub-bitstream for the other view (stream format type “0x20”) of the multi-view video coded (example: H.264/MVC) stream is transmitted.
In this case, streams of the multi-view video coding (example: H.264/MVC) method are used for both the main-view (left-eye) video stream and the sub-view (right-eye) video stream. The multi-view video coding (example: H.264/MVC) method is originally a method for transmitting multi-view pictures, and under the combination shown in
Also, when executing 3D display (output) of the 3D program, the receiving apparatus processes both the main-view (left-eye) video stream and the sub-view (right-eye) video stream, and is thereby able to reproduce the 3D program.
When the receiving apparatus executes 2D display (output) of the 3D program, it processes only the main-view (left-eye) video stream, and is thereby able to display (output) the program as a 2D program.
Further, because the base-view sub-bitstream of the multi-view coding method H.264/MVC is compatible with the existing H.264/AVC (excluding MVC) video stream, the following effects can be obtained by assigning the same stream format type “0x1B” to both, as is shown in
Further, since the sub-view (right-eye) video stream is assigned a stream format type not provided conventionally, it is ignored by existing receiving apparatuses. This makes it possible to prevent the sub-view (right-eye) video stream from being displayed (output) in a manner not intended by the broadcast station side.
Therefore, even if broadcasting of a 3D program of combination example 1 is newly started, a situation in which it cannot be displayed (output) at all on an existing receiving apparatus having the function of displaying (outputting) the existing H.264/AVC (excluding MVC) video stream can be avoided. Thus, even if such 3D program broadcasting is newly started within broadcasting operated on advertisement income, such as CMs (commercial messages), the program can still be viewed on receiving apparatuses not equipped with a 3D display (output) function, so a drop in audience ratings due to the functional limits of receiving apparatuses can be avoided; this is a merit for the broadcast station side.
In combination example 2, as the main-view (left-eye) video stream, the base-view bitstream (main view) (stream format type “0x02”) of H.262 (MPEG-2) in the case of transmitting the multiple views of a 3D picture as separate streams is transmitted, and as the sub-view (right-eye) video stream, an AVC stream (stream format type “0x22”) defined by ITU-T Recommendation H.264|ISO/IEC 14496-10 video in that case is transmitted.
Similarly to combination example 1, when executing 3D display (output) of the 3D program, the receiving apparatus processes both the main-view (left-eye) video stream and the sub-view (right-eye) video stream and is thereby able to reproduce the 3D program; and when the receiving apparatus executes 2D display (output) of the 3D program, it processes only the main-view (left-eye) video stream and is thereby able to display (output) the program as a 2D program.
Further, by making the base-view bitstream (main view) of H.262 (MPEG-2) in the case of transmitting the multiple views of a 3D picture as separate streams compatible with the existing ITU-T Recommendation H.262|ISO/IEC 13818-2 video stream, and also, as is shown in
Also, similarly to combination example 1, since the sub-view (right-eye) video stream is assigned a stream format type not provided conventionally, it is ignored by existing receiving apparatuses. This makes it possible to prevent the sub-view (right-eye) video stream from being displayed (output) in a manner not intended by the broadcast station side.
Since receiving apparatuses having the function of displaying (outputting) the existing ITU-T Recommendation H.262|ISO/IEC 13818-2 video stream are in widespread use, a drop in audience ratings due to the functional limits of receiving apparatuses can be prevented all the more, enabling broadcasting most suitable for the broadcast station.
Further, by making the sub-view (right-eye) video stream an AVC stream (stream format type “0x22”) defined by the existing ITU-T Recommendation H.264|ISO/IEC 14496-10 video, the sub-view (right-eye) video stream can be transmitted at a high compression rate.
Namely, with combination example 2, the commercial merit for the broadcast station and the technical merit of high-efficiency transmission can both be achieved.
In combination example 3, as the main-view (left-eye) video stream, the base-view bitstream (main view) (stream format type “0x02”) of H.262 (MPEG-2) in the case of transmitting the multiple views of a 3D picture as separate streams is transmitted, and as the sub-view (right-eye) video stream, the other-view bitstream (stream format type “0x21”) of the H.262 (MPEG-2) method in that case is transmitted.
In this case as well, similarly to combination example 1, the program can be displayed (output) as a 2D program on a receiving apparatus that has the function of displaying (outputting) the existing ITU-T Recommendation H.262|ISO/IEC 13818-2 video stream but does not have a 3D display (output) function.
In addition to the commercial merit for the broadcast station of preventing audience ratings from being lowered by the functional limits of receiving apparatuses, unifying the coding methods of both the main-view (left-eye) video stream and the sub-view (right-eye) video stream into the H.262 (MPEG-2) method makes it possible to simplify the hardware configuration of the video decoding function in the receiving apparatus.
Further, as in combination example 4, it is also possible to transmit the base-view sub-bitstream (main view) (stream format type “0x1B”) of the multi-view video coded (example: H.264/MVC) stream as the main-view (left-eye) video stream, and to transmit the other-view bitstream (stream format type “0x21”) of H.262 (MPEG-2) in the case of transmitting the multiple views of a 3D picture as separate streams as the sub-view (right-eye) video stream.
However, in the combination shown in
Also, in the combination shown in
The meaning of the component descriptor is as follows. “descriptor_tag” is an 8-bit field describing a value with which this descriptor can be identified as a component descriptor. “descriptor_length” is also an 8-bit field, describing the size of this descriptor. “stream_content” (component content) is a 4-bit field representing the type of the stream (video, audio, data), and it is encoded in the structure shown in
In a program map section, the value of the component tag given to each stream must be unique. The component tag is a label for identifying the component stream, and has the same value as the component tag within the stream identifier descriptor (when a stream identifier descriptor is present within the PMT). The 24-bit field “ISO_639_language_code” (language code) identifies the language of the component (audio or data), and also identifies the language of the character description contained in this descriptor.
The language code is expressed by a 3-character code defined in ISO 639-2. Each character is encoded into 8 bits in accordance with ISO 8859-1 and inserted into the 24-bit field in that order. For example, Japanese is “jpn” in 3 alphabetic characters and is encoded as “0110 1010 0111 0000 0110 1110.” “text_char” (component description) is an 8-bit field. A series of component description fields defines the character description of the component stream.
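The 24-bit packing described above can be sketched as follows; the function name is an illustrative assumption.

```python
# Pack a 3-character ISO 639-2 language code into the 24-bit
# ISO_639_language_code field: each character becomes one 8-bit value per
# ISO 8859-1, inserted in order from the most significant byte.

def encode_language_code(code):
    assert len(code) == 3
    value = 0
    for ch in code:
        # ISO 8859-1 maps each of these characters to a single byte,
        # which for ASCII letters equals ord(ch).
        value = (value << 8) | ord(ch)
    return value
```

For “jpn,” this yields the bit pattern 0110 1010 0111 0000 0110 1110 given in the example above (0x6A706E).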
“0x05” of the component content shown in
“0x07” of the component content shown in
“0x07” of the component content shown in
“0x08” of the component content shown in
As is shown in
In particular, when a 3D picture program including the videos of plural viewpoints within one picture, such as by the “Side-by-Side” method or the “Top-and-Bottom” method, is transmitted using a coding method such as MPEG-2 or H.264 AVC (excluding MVC), which is not originally specified as a multi-view video coding method, it is difficult to determine, from the “stream_type” (stream format type) mentioned above alone, whether the transmission includes pictures of plural viewpoints within one picture for a 3D picture program, or is an ordinary picture of one viewpoint. In this case, therefore, the various video methods, including whether the corresponding program is a 2D program or a 3D program, may be identified by the combination of “stream_content” (component content) and “component_type” (component type). Also, since the component descriptor is distributed by EIT for programs broadcast at present or to be broadcast in the future, the receiving apparatus 4 can produce an EPG (program list) by acquiring EIT, and can thereby obtain, as EPG information, whether a program is a 3D picture, the method of the 3D picture, the resolution, and the aspect ratio. The receiving apparatus has the merit of being able to display (output) this information on the EPG.
As mentioned above, because the receiving apparatus 4 can monitor “stream_content” and “component_type,” the effect is obtained that it can recognize whether a program received at present or to be received in the future is a 3D program.
The meaning of the component group descriptor is as follows. “descriptor_tag” is an 8-bit field describing a value with which this descriptor can be identified as a component group descriptor. “descriptor_length” is an 8-bit field describing the size of this descriptor. “component_group_type” (component group type) is a 3-bit field indicating the group type of the component.
Here, “001” indicates a 3DTV service, and it can be distinguished from the multi-view TV service, “000.” A multi-view TV service is a TV service that can display 2D pictures of plural viewpoints while switching among the viewpoints. For example, a multi-view video coded stream, or a stream of a coding method not originally specified as a multi-view video coding method in which pictures of plural viewpoints are transmitted within one picture, may be used not only for a 3D video program but also for a multi-view TV program. In such a case, even if multi-view pictures are included in the stream, it may not be possible to determine, from the “stream_type” (stream format type) mentioned above alone, whether the program is a 3D video program or a multi-view TV program. In such cases, discrimination using “component_group_type” (component group type) is effective. “total_bit_rate_flag” is a 1-bit flag indicating whether the total bit rate within the component group is described in the event. When this bit is “0,” the total bit rate field within the component group does not exist in the descriptor; when it is “1,” the field exists. “num_of_group” (group number) is a 4-bit field indicating the number of component groups in the event.
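The service discrimination described above can be sketched as follows; the function and its return strings are illustrative assumptions.

```python
# Interpret the 3-bit component_group_type field from the component group
# descriptor: "000" denotes a multi-view TV service and "001" a 3DTV
# service, as described above; other values are treated as reserved here.

def service_kind(component_group_type):
    """component_group_type: the 3-bit field, passed as an integer 0-7."""
    if component_group_type == 0b000:
        return "multi-view TV service"
    if component_group_type == 0b001:
        return "3DTV service"
    return "reserved"
```

This is the check a receiver would apply when “stream_type” alone cannot tell a 3D video program apart from a multi-view TV program.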
“component_group_id” (component group identification) is a 4-bit field describing the component group identification in accordance with that shown in
“num_of_component” (component number) is a 4-bit field indicating the number of components belonging to the component group and, further, to the charging/non-charging unit indicated by the immediately preceding “CA_unit_id.” “component_tag” (component tag) is an 8-bit field indicating a component tag value belonging to the component group.
“total_bit_rate” (total bit rate) is an 8-bit field describing the total bit rate of the components within the component group, rounding up the transmission rate of the transport stream packets in units of 1/4 Mbps. “text_length” (component group description length) is an 8-bit field indicating the byte length of the component group description that follows. “text_char” (component group description) is an 8-bit field; a series of character information fields describes the explanation relating to the component group.
As was mentioned above, the receiving apparatus 4 can observe “component_group_type”, and thereby obtains an effect of enabling it to recognize that the program received at present or to be received in the future is a 3D program.
Next, explanation will be given on an example of using a new descriptor for indicating information relating to the 3D program.
“3d_2d_type” (3D/2D type) is a field of 8 bits, and this indicates the type, 3D picture or 2D picture, within the 3D program. For example, in a 3D program whose main component is the 3D picture but in which a commercial inserted in the middle of the program is composed of the 2D picture, this field is the information for identifying whether the picture is the 3D picture or the 2D picture; i.e., it is provided for the purpose of preventing a malfunction in the receiving apparatus (i.e., a problem of display (output) generated due to the fact that the broadcast program is the 2D picture although the receiving apparatus is executing the 3D process). “0x01” indicates the 3D picture, and “0x02” indicates the 2D picture, respectively.
“3d_method_type” (3D method type) is a field of 8 bits, and this indicates the 3D method. “0x01” indicates the “3D 2-viewpoint separate ES transmission method”, “0x02” the Side-by-Side method, and “0x03” the Top-and-Bottom method, respectively. “stream_type” (stream format type) is a field of 8 bits, and this indicates the format of the ES of the program, in accordance with that shown in
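The value assignments quoted above for “3d_2d_type” and “3d_method_type” can be collected into simple lookup tables; only the values quoted in the text are filled in, and the function name is an illustrative assumption:

```python
# 8-bit field values as quoted in the text; all other values
# are treated here as reserved.
THREE_D_2D_TYPE = {0x01: "3D picture", 0x02: "2D picture"}
THREE_D_METHOD_TYPE = {
    0x01: "3D 2-viewpoint separate ES transmission method",
    0x02: "Side-by-Side method",
    0x03: "Top-and-Bottom method",
}

def describe_3d_program(t3d2d: int, method: int) -> str:
    kind = THREE_D_2D_TYPE.get(t3d2d, "reserved")
    how = THREE_D_METHOD_TYPE.get(method, "reserved")
    return f"{kind} / {how}"
```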
However, it is also possible to apply a structure of transmitting the 3D program details descriptor when the program is a 3D video program, but not transmitting it when the program is a 2D video program. From only the presence/absence of transmission of the 3D program details descriptor, it is then possible to discriminate whether the corresponding program is the 2D video program or the 3D video program.
“component_tag” (component tag) is a field of 8 bits. With this 8-bit field, the component stream of the service can refer to the described content (see
As was mentioned above, the receiving apparatus 4 can observe the 3D program details descriptor, and if this descriptor exists, thereby obtains an effect of enabling it to recognize that the program received at present or to be received in the future is a 3D program. In addition thereto, in a case where the program is the 3D program, it is possible to identify or discriminate the type of the 3D transmission method, and if the 3D picture and the 2D picture are mixed therein, it is possible to discriminate the type thereof.
Next, explanation will be given on an example of identifying or discriminating whether it is the 3D picture or the 2D picture by a unit of the service (composition channel).
The meaning of the service descriptor is as follows. Thus, “service_type” (service form type) is a field of 8 bits, and this indicates the kind of the service, in accordance with that shown in
As was mentioned above, the receiving apparatus 4 can observe “service_type”, and thereby obtains an effect of enabling it to recognize that the service (the composition channel) is a channel of 3D programs. In this manner, if it is possible to discriminate whether the service (the composition channel) is the 3D video service or the 2D video service, on the EPG display can be made such a display indicating that the corresponding service is a 3D video program broadcast service. However, in spite of being a service mainly broadcasting the 3D video program, there may be a case where the 2D picture must be broadcast, for example where a source of the advertising video is only the 2D video. Accordingly, in the discrimination of the 3D video service with using “service_type” of that service descriptor, it is preferable to execute it in combination with the discrimination of the 3D video program with using “stream_content” (component content) and “component_type” (component type) as was mentioned previously, the discrimination of the 3D video program with using “component_group_type” (component group type), or the discrimination of the 3D video program by means of the 3D program details descriptor. When executing the discrimination by combining plural pieces of information, it is possible to discriminate that, although being the 3D video broadcast service, a part of the programs is the 2D picture, etc. If such discrimination can be made, the receiving apparatus is able to indicate clearly that the corresponding service is a “3D video broadcast service”, on the EPG, for example, and further, if the 2D video program is mixed therein, it is possible to exchange display control, etc., between the 3D video program and the 2D video program, depending on the necessity thereof, when receiving the program, and so on.
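A hedged sketch of the combined check suggested above: “service_type” marks the service as a 3D video broadcast service, but per-program information (the component group type, or the presence of the 3D program details descriptor) decides whether the program currently airing is actually 3D. The constant values are assumptions for illustration only:

```python
SERVICE_TYPE_3D = 0x02          # hypothetical code for a 3D video service
COMPONENT_GROUP_TYPE_3DTV = 0b001

def is_3d_program(service_type, component_group_type=None,
                  has_3d_details_descriptor=False):
    if service_type != SERVICE_TYPE_3D:
        # Not a 3D broadcast service; program-level information can
        # still flag a 3D program via the details descriptor.
        return has_3d_details_descriptor
    # On a 3D service, a 2D program (e.g. a 2D-only advertisement)
    # is detected when neither program-level indicator marks it as 3D.
    return (component_group_type == COMPONENT_GROUP_TYPE_3DTV
            or has_3d_details_descriptor)
```

Combining the service-level and program-level indicators in this way is what allows a receiver to switch display control when a 2D program is mixed into a 3D service.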
The meaning of the service list descriptor is as follows. Thus, “service_id” (service identification) is a field of 16 bits, for identifying the information service in that transport stream uniquely. The service identification is equal to the broadcast program number identification (“program_number”) in the program map section corresponding thereto. “service_type” (service form type) is a field of 8 bits, and this indicates the type of the service, in accordance with that shown in
With this “service_type” (service form type), since it is possible to discriminate whether a service is the “3D video broadcast service” or not, it is possible, for example, with using the list of the composition channels and the types thereof shown in that service list descriptor, to execute such a display as grouping only the “3D video broadcast services” on the EPG display, etc.
As was mentioned above, the receiving apparatus 4 can observe “service_type”, and thereby obtains an effect of enabling it to recognize that the composition channel is a channel of 3D programs.
The examples of the descriptors explained in the above describe only the representative members thereof; however, it can be also considered to combine plural members into one (1), to divide one of the members into plural members each having detailed information, and so on.
The component descriptor, the component group descriptor, the 3D program details descriptor, the service descriptor and the service list descriptor of the program information, which are explained in the above, are information transmitted from the transmitting apparatus 1, being produced and added in the management information assigning unit 16, and being stored in the PSI of MPEG-TS (for example, PMT, etc.) or in the SI (for example, EIT, SDT or NIT, etc.).
Hereinafter, explanation will be given on an example of a transmission management rule or regulation in the transmitting apparatus 1 for the program information.
In “component_type” is described the video component type of the corresponding component. The component type is determined from those shown in
In “text_char” are described 16 bytes (8 double-byte characters) or less, as a name of the video type, when there are plural video components. No line feed (or return) code is used. In a case where the description of the component is the default character line, this field can be omitted. The default character line is “video”.
However, one (1) is necessarily transmitted for every video component having a “component_tag” value of “0x00”-“0x0F” included in an event (program).
With managing in the transmitting apparatus 1 in this manner, the receiving apparatus 4 can observe “stream_content” and “component_type”, and thereby obtains an effect of enabling it to recognize that the program received at present or to be received in the future is a 3D program.
In “descriptor_tag” is described “0xD9”, meaning that it is the component group descriptor. In “descriptor_length” is described the descriptor length of the component group descriptor. The maximum length of the descriptor is not defined. “component_group_type” indicates the type of the component group. “000” indicates the multi-view TV, and “001” indicates the 3D TV, respectively.
In “total_bit_rate_flag” is indicated “0” when all the total bit rates in the groups within the event are at the predetermined default value, or “1” when any one of the total bit rates in the groups within the event exceeds the predetermined default value.
In “num_of_group” is described the number of the component groups within the event. It is assumed to be “3” at the maximum in case of the multi-view TV (MVTV), and “2” at the maximum in case of the 3D TV (3DTV).
In “component_group_id” is described the component group identification. “0x0” is assigned in case of the main group, and the broadcast undertaker assigns it uniquely within the event in case of each sub-group.
In “num_of_CA_unit” is described the number of accounting/unaccounting units in the component group. The maximum number is assumed to be “2”. It is “0x1” when no component on which accounting should be taken is included at all within the corresponding component group.
In “CA_unit_id” is described the accounting unit identification. To this, the broadcast undertaker assigns a value uniquely within the event. In “num_of_component” is described the number of the components belonging to the corresponding component group and also to the accounting/unaccounting unit shown by “CA_unit_id” just before. The maximum number is assumed to be “15”.
In “component_tag” is described a component tag value belonging to the component group. In “total_bit_rate” is described the total bit rate within the component group. However, “0x00” is described in case of the default value.
In “text_length” is described the byte length of the component group description following thereafter. The maximum value is assumed to be “16” (8 double-byte characters). In “text_char” is necessarily described an explanation relating to the component group. No default character line is defined. Also, no line feed (or return) code is used.
However, when executing the multi-view TV service, “component_group_type” is necessarily transmitted as “000”. Also, when executing the 3D TV service, “component_group_type” is necessarily transmitted as “001”.
With executing the transmission management in the transmitting apparatus 1 in this manner, the receiving apparatus 4 can observe “component_group_type”, and thereby obtains an effect of enabling it to recognize that the program received at present or to be received in the future is a 3D program.
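The transmission-management limits listed above (at most 3 groups for multi-view TV and 2 for 3D TV, at most 2 accounting/unaccounting units, at most 15 components per unit, description text within 16 bytes) can be sketched as a validator that the transmitting apparatus 1 might run before emitting a component group descriptor. The limits come straight from the text; the data shape is an assumption:

```python
MAX_GROUPS = {"000": 3, "001": 2}    # multi-view TV / 3D TV
MAX_CA_UNITS = 2
MAX_COMPONENTS = 15
MAX_TEXT_BYTES = 16

def validate_descriptor(component_group_type: str, groups: list) -> list:
    """Return a list of rule violations (empty when compliant)."""
    errors = []
    limit = MAX_GROUPS.get(component_group_type)
    if limit is None:
        errors.append("component_group_type must be '000' or '001'")
    elif len(groups) > limit:
        errors.append(f"num_of_group exceeds {limit}")
    for g in groups:
        if len(g.get("ca_units", [])) > MAX_CA_UNITS:
            errors.append("num_of_CA_unit exceeds 2")
        for ca in g.get("ca_units", []):
            if len(ca.get("component_tag", [])) > MAX_COMPONENTS:
                errors.append("num_of_component exceeds 15")
        if len(g.get("text", b"")) > MAX_TEXT_BYTES:
            errors.append("component group description exceeds 16 bytes")
    return errors
```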
With executing the transmission management in the transmitting apparatus 1, the receiving apparatus 4 can observe the 3D program details descriptor, and thereby, if this descriptor exists, obtains an effect of enabling it to recognize that the program received at present or to be received in the future is a 3D program.
The service type is determined from those shown in
In “char” is described the undertaker's name, in 10 double-byte characters, in case of the BS/CS digital television broadcast. Nothing is described in case of the terrestrial digital television broadcast. In “service_name_length” is described the composition channel name length. The maximum value is assumed to be “20”. In “char” is described the composition channel name. It is within 20 bytes and within 10 double-byte characters. However, for each target composition channel, necessarily only one (1) is disposed.
With executing the transmission management in the transmitting apparatus 1, the receiving apparatus 4 can observe “service_type”, and thereby obtains an effect of enabling it to recognize that the composition channel is a channel of 3D programs.
In “service_id” is described the “service_id” included in that transport stream. In “service_type” is described the service type of the target service. It is determined from those shown in
With executing the transmission management in the transmitting apparatus 1, the receiving apparatus 4 can observe “service_type”, and thereby obtains an effect of enabling it to recognize that the composition channel is a channel of 3D programs.
In the above, the explanation was given on the example of transmission of the program information from the transmitting apparatus 1. In addition, if transmission is executed while inserting an indication, such as “a 3D program will start from now”, “please wear glasses for 3D viewing when enjoying the 3D display”, “2D display is recommended when the viewer's eyes are tired or bodily condition is bad”, or “viewing 3D programs for a long time may bring about fatigue of the eyes or bad bodily condition”, etc., into the 3D program produced by the transmitting apparatus 1, with using a telop (subtitle), etc., on a first screen when the 3D program starts, in particular when the program changes from a 2D program to a 3D program, then there can be a merit that attention/alarm against viewing of the 3D program can be given, on the receiving apparatus 4, to the user viewing the 3D program.
In the figure, the flow of signals connecting each block is shown like a single signal path, as an outline thereof; however, there are cases where plural signals are transmitted/received simultaneously, due to time-division multiplexing, connecting via multiple lines, or the like. For example, between the multiplex divider unit 29 and the video decoder unit 30, plural video signals can be transmitted at the same time; therefore it is also possible to execute processes, such as decoding plural video ES in the video decoder unit, 2-screen display of the picture, simultaneous decoding for recording and viewing, etc.
System structures or configurations including the receiving apparatus, a viewing/listening apparatus, and a 3D view assisting or supporting device (for example, 3D glasses) will be shown in
In
However, the explanation mentioned above was given on the example, upon an assumption that the display device 3501 and the 3D view supporting device 3502 shown in
Also in
In this case, the video signal outputted from the video output 41 of the video/audio output apparatus 3601 (the receiving apparatus 4), the audio signal outputted from the audio output 42, and the control signal outputted from the control signal output unit 43 are converted into a transmission signal of a format suitable to the format defined on the transmission path 3602 (for example, the format defined by the HDMI regulation), and inputted to the display 3603, passing through the transmission path 3602. The display 3603, receiving that transmission signal thereon, decodes it into the original video signal, audio signal and control signal, and thereby outputs the 3D view supporting device control signal 3503 to the 3D view supporting device 3502, as well as outputting the video and the audio.
However, the explanation mentioned above was given on the example, upon an assumption that the display device 3601 and the 3D view supporting device 3602 shown in
However, a part of each of the constituent elements 21-46 shown in
Also, each module, as well as each piece of hardware inside the receiving apparatus 4, executes communication of information through the common bus 22. Also, the relation lines (i.e., arrows) described in the figures are shown mainly on the portions relating to the explanation given presently; however, also between other modules, there are processes which need communication means and communication. For example, the tuning controller unit 59 obtains the necessary program information from the program information analyzer unit 54, appropriately.
Next, explanation will be given on the function of each function block. The system controller unit 51 manages the condition of each module and the instruction condition of the user, etc., and also gives control instructions to each module. The user instruction receiver unit 52, receiving and interpreting an input signal of a user operation which is received by the control signal transmitter/receiver 33, informs or transmits the user's instruction to the system controller unit 51. The equipment control signal transmitter unit 53, in accordance with an instruction(s) from the system controller unit 51 and/or other module(s), gives an instruction to the control signal transmitter/receiver 33 to transmit the equipment control signal.
The program information analyzer unit 54 obtains the program information from the multiplex divider unit 29, analyzes the content thereof, and provides necessary information to each module. The time management unit 55 obtains the time correction information (TOT: Time Offset Table) included in the TS, from the program information analyzer unit 54, and thereby manages the present time, and at the same time, it gives a notice of an alarm (a notice of reaching the designated time) or a one-shot timer (a notice of passage of a predetermined time period), in accordance with a request of each module.
The network controller unit 56 controls the network I/F 25, so as to obtain various kinds of information and the TS from a specific URL (Uniform Resource Locator) and/or a specific IP (Internet Protocol) address. The decode controller unit 57 controls the video decoder unit 30 and the audio decoder unit 31, i.e., instructing start and stop of decoding, obtaining information included in the stream, etc.
The recording/reproducing controller unit 58 controls the recording/reproducing unit 27, thereby reading out the signal from the recording medium 26, from a specific position of a specific content, in an arbitrary format of read-out (ordinary reproduction, fast-forward, rewinding, pause). Also, it executes control of recording the signal inputted into the recording/reproducing unit 27 onto the recording medium.
The tuning controller unit 59 controls the tuner 23, the descrambler 24, the multiplex divider unit 29 and the decode controller unit 57, thereby executing receiving of the broadcast signal and recording of the broadcast signal. Or, it executes reproduction from the recording medium, and it also executes controls for outputting the video signal and the audio signal therefrom. Details of the operations of broadcast receiving, recording of the broadcast signal, and reproducing from the recording medium will be mentioned later.
The OSD producer unit 60 produces OSD data including a specific message therein, and gives an instruction to the video conversion controller unit 61 to superimpose the produced OSD data on the video signal. Herein, the OSD producer unit 60 produces OSD data having parallax, such as for the left-side eye and for the right-side eye, and requests the video conversion controller unit 61 to make the 3D display upon the basis of the OSD data for the left-side eye and for the right-side eye, thereby achieving a message display in 3D.
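A minimal sketch of the parallax OSD idea above: the same message is drawn once per eye with a small horizontal offset, so the superimposed message appears at a 3D depth. The coordinate convention and sign of the offset are assumptions for illustration:

```python
def osd_positions(x: int, y: int, parallax_px: int):
    """Return ((left-eye x, y), (right-eye x, y)) for one OSD element.
    A positive parallax shifts the left-eye view left and the right-eye
    view right (uncrossed disparity), placing the message behind the
    screen plane; a negative value pulls it toward the viewer."""
    half = parallax_px // 2
    return (x - half, y), (x + half, y)
```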
The video conversion controller unit 61 controls the video conversion processor unit 32, so as to convert the video signal inputted into the video conversion processor unit 32 into 3D or 2D in accordance with an instruction from the system controller unit 51 mentioned above, to superimpose thereon the OSD inputted from the OSD producer unit 60, and further to execute processing on the video (scaling, PinP, 3D display, etc.) or the 2D/3D conversion, depending on the necessity thereof, thereby displaying it on the display 47 or outputting it to the outside. Details of the methods of converting the 3D video or the 2D video into the predetermined format, and of the 2D/3D conversion, within the video conversion processor unit 32 will be mentioned later. Each function block provides such functions as those.
Herein, explanation will be given on the controlling steps and the flows of the signals when executing broadcast receiving. First of all, the system controller unit 51, receiving from the user instruction receiver unit 52 an instruction of the user indicating to receive the broadcast of a specific channel (CH) (for example, pushing down the CH button on the remote controller), instructs the tuning controller unit 59 to execute tuning into the CH instructed by the user (hereinafter, the “designated CH”).
The tuning controller unit 59 receiving the instruction mentioned above gives an instruction of receiving control at the designated CH (tuning into a designated frequency band, demodulation process of the broadcast signal, error correction process) to the tuner 23, thereby driving it to output the TS to the descrambler 24.
Next, the tuning controller unit 59 instructs the descrambler 24 to descramble the TS and to output it to the multiplex divider unit 29, and instructs the multiplex divider unit 29 to demultiplex the inputted TS, and also to output the demultiplexed video ES to the video decoder unit 30 and the audio ES to the audio decoder unit 31.
Also, the tuning controller unit 59 instructs the decode controller unit 57 to decode the video ES and the audio ES, which are inputted into the video decoder unit 30 and the audio decoder unit 31. The decode controller unit 57 receiving the decoding instruction mentioned above controls the video decoder unit 30 to output the decoded video signal to the video conversion processor unit 32, while it controls the audio decoder unit 31 to output the decoded audio signal to the speaker 48 or the audio output 42. In this manner, control for outputting the video and the audio of the CH designated by the user is carried out.
Also, for displaying a CH banner (an OSD for showing the CH number, the program name, the program information, etc.) when tuning, the system controller unit 51 instructs the OSD producer unit 60 to produce and output the CH banner. The OSD producer unit 60 receiving the instruction mentioned above transmits the produced data of the CH banner to the video conversion controller unit 61, and the video conversion controller unit 61, receiving the data mentioned above, makes such control that the CH banner is superimposed on the video signal to be outputted. In this manner, the message display when tuning, etc., is carried out.
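The broadcast-receiving control flow above can be condensed into one hedged sequence, expressed here as the ordered instructions that the system controller unit 51 and the tuning controller unit 59 issue; the function and the pair format are illustrative stand-ins, not part of the apparatus:

```python
def broadcast_receive_steps(designated_ch: str) -> list:
    """Return the ordered control instructions, as (target unit, action)
    pairs, matching the flow described in the text."""
    return [
        ("tuner 23", f"tune to {designated_ch}, demodulate, error-correct"),
        ("descrambler 24", "descramble TS, output to multiplex divider 29"),
        ("multiplex divider 29", "demultiplex, route video ES / audio ES"),
        ("video decoder 30", "decode, output to video conversion processor 32"),
        ("audio decoder 31", "decode, output to speaker 48 / audio output 42"),
        ("OSD producer 60", "produce CH banner, superimpose via unit 61"),
    ]
```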
Next, explanation will be given about the recording control of the broadcast signals and the flows of the signals. When recording a specific CH, the system controller unit 51 instructs the tuning controller unit 59 to tune into the specific CH and to output the signal to the recording/reproducing unit 27.
The tuning controller unit 59 receiving the above-mentioned instruction thereon, similar to the broadcast receiving process mentioned above, instructs the tuner 23 to execute receiving control of the designated CH, instructs the descrambler 24 to descramble the MPEG2-TS received from the tuner 23, and instructs the multiplex divider unit 29 to output the input from the descrambler 24 to the recording/reproducing unit 27.
Also, the system controller unit 51 instructs the recording/reproducing controller unit 58 to record the TS inputted into the recording/reproducing unit 27. The recording/reproducing controller unit 58 receiving the instruction mentioned above executes the necessary processes, such as encryption, etc., upon the signal (TS) inputted into the recording/reproducing unit 27, and also, after producing the additional information necessary when recording/reproducing (i.e., content information, such as the program information of the recording CH and/or the bit rate thereof, etc.), and also after recording the management data (the ID of the recording content, the recording position on the recording medium 26, the recording format, the encoding information, etc.), it executes a process for writing the management data onto the recording medium 26. In this manner, recording of the broadcast signal is carried out. Hereinafter, such recording method will be called “TS recording”, for distinguishing it from a method of executing conversion to record, as will be mentioned below.
Explanation will be given on an example wherein recording is executed through another passage, in particular when recording the broadcast signal after applying processes (for example, conversion of the format of the video signal and the audio signal, video compression, 2D/3D conversion of the video, etc.) upon the video and/or the audio included therein (hereinafter, “convert recording”). The system controller unit 51, similar to the TS recording, instructs the tuning controller unit 59 to tune into the specific CH. The tuning controller unit 59 receiving the instruction mentioned above, similar to the broadcast receiving process mentioned above, instructs the tuner 23 to receive the designated CH, instructs the descrambler 24 to descramble the MPEG2-TS received from the tuner 23, and also instructs the multiplex divider unit 29 to demultiplex the TS inputted from the descrambler 24, thereby to output to the video decoder unit 30 and the audio decoder unit 31. The video decoder unit 30 decodes the signal, and outputs the video to the video conversion processor unit 32. Herein, the video conversion processor unit 32 executes the necessary conversion processes (format conversion of the video signal, the 2D/3D conversion process, etc.), and outputs the signal to the video encoding unit 35. The video encoding unit 35 receiving the output mentioned above encodes that signal, and outputs the video ES to the multiplex/composer unit 37. Similarly, the audio signal is also decoded in the audio decoder unit 31, and the audio signal is outputted to the audio encoder unit 36; then, after being treated with the necessary processes thereon in the audio encoder unit, the audio ES is outputted to the multiplex/composer unit 37.
The multiplex/composer unit 37, inputting that video ES and that audio ES therein, obtains other information necessary for multiplexing (for example, the program information, etc.) from the multiplex divider unit 29, or from the CPU 21 depending on the necessity thereof, multiplexes it together with the above-mentioned video ES and the above-mentioned audio ES, and thereby outputs to the recording/reproducing unit 27.
Thereafter, similar to the case of the TS recording mentioned above, the system controller unit 51 instructs the recording/reproducing controller unit 58 to record the TS inputted from the multiplex/composer unit 37 into the recording/reproducing unit 27. The recording/reproducing controller unit 58 receiving the instruction mentioned above executes the necessary processes, such as encryption, etc., upon the signal (TS) inputted into the recording/reproducing unit 27, and, after producing the additional information necessary when recording/reproducing (i.e., content information, such as the program information of the recording CH and/or the bit rate thereof, etc.), and also after recording the management data (the ID of the recording content, the recording position on the recording medium 26, the recording format, the encoding information, etc.), it executes a process for writing the management data onto the recording medium 26. In this manner, recording of the converted broadcast signal is carried out.
<Reproduction from Recording Medium>
Next, explanation will be given about the reproducing process from the recording medium. When reproducing a specific program, the system controller unit 51 gives an instruction of reproducing the specific program to the recording/reproducing controller unit 58. As the instruction in this instance, there are indicated the ID of the content and a reproduction start point (for example, the top of the program, a position of 10 minutes from the top, a position of 100 Mbytes from the top, etc.). The recording/reproducing controller unit 58 receiving the instruction mentioned above controls the recording/reproducing unit 27, thereby reads out the signal (TS) from the recording medium 26 with using the additional information and the management data, and, after executing the necessary processes, such as decryption, etc., outputs the TS to the multiplex divider unit 29.
Also, the system controller unit 51 gives an instruction of outputting the video/audio signals of the reproduced signal to the tuning controller unit 59. The tuning controller unit 59 receiving the instruction mentioned above controls the input from the recording/reproducing unit 27 to be outputted to the multiplex divider unit 29, and instructs the multiplex divider unit 29 to demultiplex the inputted TS, to output the demultiplexed video ES to the video decoder unit 30, as well as to output the demultiplexed audio ES to the audio decoder unit 31.
Also, the tuning controller unit 59 instructs the decode controller unit 57 to decode the video ES and the audio ES, which are inputted into the video decoder unit 30 and the audio decoder unit 31, respectively. The decode controller unit 57 controls the video decoder unit 30 to output the decoded video signal to the video conversion processor unit 32, and also controls the audio decoder unit 31 to output the decoded audio signal to the speaker 48 or the audio output. In this manner, the processes for reproducing the signals from the recording medium are carried out.
As a method for displaying the 3D picture applicable to the present invention, there are several methods, wherein the pictures for the left-side eye and for the right-side eye are produced in such a manner that parallax is generated between the left-side eye and the right-side eye, thereby causing a human to recognize that a stereoscopic object exists.
As one of the methods, there is an active shutter method, wherein light shielding is done alternately between the left-side glass and the right-side glass of the glasses which the user wears, with using a liquid crystal shutter, etc., thereby generating parallax between the pictures appearing on the left-side and right-side eyes.
In this case, the receiving apparatus 4 outputs a sync signal and a control signal from the control signal output unit 43 and the equipment control signal transmitting terminal 44 to the active shutter-type glasses which the user wears. Also, the video signal is outputted from the video signal output unit 41 to the external 3D video display device or apparatus, so as to display the picture for the left-side eye and the picture for the right-side eye thereon alternately. Or, the similar 3D display is conducted on the display 47 that the receiving apparatus 4 has. With doing in this manner, the user wearing the active shutter-type glasses can enjoy or view the 3D picture on the 3D video display device or on the display 47 that the receiving apparatus 4 has.
Also, as another method, there is a polarization method, wherein, upon the glasses which the user wears, linear polarization coats perpendicular to each other in the direction of linear polarization, or circular polarization coats opposite to each other in the rotation direction of circular polarization, are applied to the left-side and right-side glasses, and a polarized picture for the left-side eye and a polarized picture for the right-side eye, corresponding to the left-side polarization and the right-side polarization of the glasses and differing from each other, are outputted simultaneously; i.e., the pictures are separated or divided to be incident upon the left-side eye and the right-side eye, respectively, depending on the polarization conditions thereof, thereby generating parallax between the left-side eye and the right-side eye.
In this case, the receiving apparatus 4 outputs the video signal from the video signal output unit 41 to the 3D video display device or apparatus, and then said 3D video display device displays the video for the left-side eye and the video for the right-side eye under polarization conditions differing from each other. Or, the similar display is carried out by the display 47 that the receiving apparatus 4 has. With doing in this manner, the user wearing the polarization glasses can enjoy or view the 3D video or picture on said 3D video display device or on the display 47 that the receiving apparatus 4 has. Further, with the polarization method, since the viewing/listening of the 3D video can be made without transmitting the sync signal and/or the control signal from the receiving apparatus 4 to the polarization glasses, there is no necessity of outputting the sync signal and/or the control signal from the control signal output unit 43 and the equipment control signal transmitting terminal 44.
Also, other than those, there may be applied a color separation method for separating the pictures for the left-side/right-side eyes depending on the colors. Or, there may be applied a parallax barrier method of creating the 3D picture, with using a parallax barrier, visible by the naked eye.
However, the 3D display method relating to the present invention should not be restricted to a specific method.
<Example of Detailed Determination Method of 3D Program with Using Broadcast Program>
As an example of the determining method of the 3D program, there is a method of obtaining the information for determining on whether it is the 3D program or not, which is newly included in various kinds of tables and/or descriptors of the program information of the broadcast signals and reproduction signals, which were already explained, and thereby enabling to determine on whether it is the 3D program or not.
Determination is made on whether it is the 3D program or not, by confirming the information for determining to be the 3D program or not, which is newly included in the component descriptor or the component group descriptor, described on the table, such as, PMT or EIT [schedule basic/schedule extended/present/following], or by confirming the 3D program details descriptor, which is a new descriptor for use of determination of the 3D program, or by confirming the information for determining to be the 3D program or not, which is newly included in the service descriptor or the service list descriptor described on the table, such as, NIT, SDT, etc. Such information is attached to the broadcast signal in the transmitting apparatus mentioned previously, and is transmitted. In the transmitting apparatus, such information is assigned to the broadcast signal by the management information assignment unit 16.
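The dispatch upon the descriptors mentioned above can be sketched as follows. This is a minimal sketch, assuming simplified descriptor records already extracted from the tables; the function name and dictionary layout are illustrative assumptions, while the tag values (0x50, 0xD9, 0xE1) are those given in this description, and the set of 3D component_type values is a placeholder (the actual values are designated in a figure).

```python
COMPONENT_DESCRIPTOR = 0x50
COMPONENT_GROUP_DESCRIPTOR = 0xD9
PROGRAM_DETAILS_3D_DESCRIPTOR = 0xE1

def indicates_3d(descriptor_tag, fields):
    """Return True when one descriptor indicates the 3D program."""
    if descriptor_tag == PROGRAM_DETAILS_3D_DESCRIPTOR:
        # mere presence of the 3D program details descriptor can mean 3D
        return True
    if descriptor_tag == COMPONENT_GROUP_DESCRIPTOR:
        # component_group_type "001" denotes the 3D TV service
        return fields.get("component_group_type") == 0b001
    if descriptor_tag == COMPONENT_DESCRIPTOR:
        # a component_type value reserved for 3D video would be checked here;
        # "types_3d" is a hypothetical placeholder for those values
        return fields.get("component_type") in fields.get("types_3d", ())
    return False
```

The receiving apparatus would apply such a check to every descriptor found in PMT and/or EIT for the corresponding event (program).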
As a way of using each table, for example, PMT has the following characteristics: since only the information of the present program is described thereon, it is impossible to confirm the information of future programs, but it has a high reliability. On the other hand, with EIT [schedule basic/schedule extended], although it is possible to obtain not only the information of the present program but also that of the future programs, it has the following demerits: i.e., it takes a long time until completing the receipt thereof, it needs a lot of memory area for holding it, and it has a low reliability because they are future events. With EIT [following], since it is possible to obtain the information of the program on the next broadcasting time, it is suitable for application to the present embodiment. Also, EIT [present] can be used for obtaining the present program information, and it can obtain information different from that of PMT.
Next, explanation will be made on a detailed example of the process in the receiving apparatus 4 relating to the program information, which is transmitted from the transmitting apparatus 1 and is explained by referring to
When “descriptor_tag” is “0x50”, the corresponding descriptor is determined to be the component descriptor. With “descriptor_length”, it is determined to be the descriptor length of the component descriptor. If “stream_content” is “0x01”, “0x05”, “0x06” or “0x07”, then the corresponding descriptor is determined to be valid (video). In case where it is other than “0x01”, “0x05”, “0x06” and “0x07”, the corresponding descriptor is determined to be invalid. In case where the “stream_content” is “0x01”, “0x05”, “0x06” or “0x07”, the following processes are executed.
With “component_type”, the video component type of the corresponding component is determined. To this component type is designated any one of the values shown in
“component_tag” is a component tag value unique within the corresponding program, and can be used by referring to the component tag value of the stream descriptor of PMT.
With “ISO_639_language_code”, the character code disposed thereafter is treated as “jpn”, even if it is other than “jpn(“0x6A706E”)”.
With “text_char”, characters within 16 bytes (8 double-byte characters) are determined to be the component description. If this field is omitted, it is determined to be a default component description. The default component description is “video”.
As was mentioned above, with the component descriptor, it is possible to determine the video component type building up the event (program), therefore the component descriptor can be used when selecting the video component within the receiving apparatus.
However, it is assumed that only the video component, the “component_tag” value of which is set to “0x00”-“0x0F”, is a target of the selection alone. The video component, being set with a “component_tag” value other than those mentioned above, does not become the target of the selection alone, and should not be used as a target for, such as, the component selection function, etc.
Also, there are cases where the component description does not coincide with an actual component, due to a mode change, etc., generated during the event (program). (In “component_type” of the component descriptor is described a representative component type of the corresponding component, and this value is not changed, in real time, responding to the mode change on the way of the program.)
Also, “component_type” described by the component descriptor is referred to, when determining the default “maximum_bit_rate” in case where the digital copy control descriptor, which is a description of the information for controlling copy generation and the maximum transmission rate in digital recording equipment, is omitted for the corresponding event (program).
In this manner, by executing the process for each field of that descriptor, the receiving apparatus 4 can observe “stream_content” and “component_type”, and thereby obtains an effect of enabling to recognize that the program, which is received at present or will be received in future, is the 3D program.
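The field checks above can be sketched as follows; this is a minimal sketch, assuming the descriptor has already been split into named fields, and the function name and dictionary layout are illustrative assumptions.

```python
VALID_STREAM_CONTENT = {0x01, 0x05, 0x06, 0x07}

def parse_component_descriptor(desc):
    """Apply the validity and default rules described above; None = invalid."""
    if desc["stream_content"] not in VALID_STREAM_CONTENT:
        return None  # the corresponding descriptor is determined to be invalid
    # if text_char is omitted, the default component description is "video"
    text = desc.get("text_char") or "video"
    component_tag = desc["component_tag"]
    return {
        "component_type": desc["component_type"],
        "component_tag": component_tag,
        "description": text[:16],  # within 16 bytes (8 double-byte characters)
        # only component_tag 0x00-0x0F is a target of the selection alone
        "selectable_alone": 0x00 <= component_tag <= 0x0F,
    }
```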
When “descriptor_tag” is “0xD9”, the corresponding descriptor is determined to be the component group descriptor.
By means of “descriptor_length”, it is determined to be the descriptor length of the component group descriptor.
When “component_group_type” is “000”, it is determined to be the multi-view TV service, and when “001”, determined to be the 3D TV service.
When “total_bit_rate_flag” is “0”, it is determined that the total bit rate within the group in the event (program) is not described in the corresponding descriptor. If “1”, it is determined that the total bit rate within the group in the event (program) is described in the corresponding descriptor.
With “num_of_group”, it is determined to be the number of the component group in the event (program). There is defined the maximum value, and if exceeding that, there is possibility that it may be processed as that maximum value.
With “num_of_CA_unit”, it is determined to be the number of the accounting/unaccounting unit(s) in the component group. If exceeding the maximum value, there is possibility that it may be processed as “2”.
When “CA_unit_id” is “0x0”, it is determined to be the unaccounting unit group. If “0x1” it is determined to be the accounting unit, including a default ES group therein. If other than “0x0” and “0x1”, it is determined to be an accounting unit type other than those mentioned above.
With “num_of_component”, it is determined to be a number of the components, which belong to the corresponding component group and belong to the accounting/unaccounting unit indicated by “CA_unit_id” just before. If exceeding the maximum value, there is possibility that it may be processed as “15”.
With “component_tag”, it is determined to be the component tag value belonging to the component group, and it can be used by referring to the component tag value of the PMT stream descriptor.
With “total_bit_rate”, it is determined to be the total bit rate within the component group. However, when “0x00”, it is determined to be a default.
If “text_length” is equal to or less than 16 (8 double-byte characters), it is determined to be the length of the component group description; otherwise, if larger than 16 (8 double-byte characters), the part of the explanation exceeding the 16 (8 double-byte characters) of the component group description can be neglected.
“text_char” indicates an explanation relating to the component group. Further, with disposition of the component group descriptor of “component_group_type”=“000”, it can be determined that the multi-view TV service is conducted in the corresponding event (program); therefore, it can be utilized in the process for each of the component groups.
Also, with disposing the component group descriptor of “component_group_type”=“001”, it can be determined that the 3D TV service is conducted in the corresponding event (program); therefore, it can be utilized in the process for each of the component groups.
Further, the default ES group of each group is necessarily described in the component disposed at the top of the “CA_unit” loop.
In the main group (component_group_id=0x0) are described the followings:
Also, in the sub-group (component_group_id>0x0) are described the followings:
In this manner, by executing the process for each field of that descriptor, the receiving apparatus 4 can observe “component_group_type”, and thereby obtains an effect of enabling to recognize that the program, which is received at present or will be received in future, is the 3D program.
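The component group descriptor rules above can be sketched as follows; this is a minimal sketch, in which the function names and the string labels are illustrative assumptions, while the maxima (2 and 15) are the fallback values named in this description.

```python
def classify_component_group_type(component_group_type):
    """Map component_group_type to the service it denotes."""
    if component_group_type == 0b000:
        return "multi-view TV service"
    if component_group_type == 0b001:
        return "3D TV service"
    return "undefined"

def clamp_group_counts(num_of_CA_unit, num_of_component):
    """Values exceeding the defined maxima may be processed as those maxima."""
    return min(num_of_CA_unit, 2), min(num_of_component, 15)
```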
When “descriptor_tag” is “0xE1”, the corresponding descriptor is determined to be the 3D program details descriptor. With “descriptor_length”, it is determined to be the descriptor length of the 3D program details descriptor. With “3d_2d_type”, it is determined to be the 3D/2D type in the corresponding 3D program.
This is designated from those shown in
With “stream_type”, it is determined to be the type of ES of the corresponding 3D program. This is designated from those shown in
Further, it is possible to apply such structure that the corresponding program can be determined to be the 3D video program or not, depending on the presence/absence of the 3D program details descriptor itself. Thus, in this case, if there is no 3D program details descriptor, it is determined to be the 2D video program; otherwise, if there is the 3D program details descriptor, then it is determined to be the 3D video program.
In this manner, by executing the process for each field of that descriptor, the receiving apparatus 4 can observe the 3D program details descriptor, and thereby obtains an effect of enabling to recognize that the program, which is received at present or will be received in future, is the 3D program.
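The presence-based determination above can be sketched as follows; this is a minimal sketch, in which the descriptor list layout and the function name are illustrative assumptions, while the tag value 0xE1 is the one given in this description.

```python
DETAILS_3D_TAG = 0xE1  # descriptor_tag of the 3D program details descriptor

def is_3d_video_program(descriptors):
    """2D video program unless the 3D program details descriptor is present."""
    for d in descriptors:
        if d["descriptor_tag"] == DETAILS_3D_TAG:
            return True  # determined to be the 3D video program
    return False  # determined to be the 2D video program
```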
With “service_provider_name_length”, if equal to or less than “20”, it is determined to be the undertaker name length, while if larger than “20”, the undertaker name is determined to be invalid, in case of receiving the BS/CS digital television broadcasts. On the other hand, in case of receiving the terrestrial digital television broadcast, it is determined to be invalid if other than “0x00”.
With “char”, it is determined to be the undertaker's name, in case of receiving the BS/CS digital television broadcasts. On the other hand, in case of receiving the terrestrial digital television broadcast, the content described is neglected. If “service_name_length” is equal to or less than “20”, it is determined to be the composite channel name length, while if larger than “20”, the composite channel name length is determined to be invalid.
With “char”, it is determined to be the composite channel name. However, if impossible to receive SDT, in which the descriptor is disposed in accordance with the transmission management rule explained in
In this manner, by executing the process for each field of that descriptor, the receiving apparatus 4 can observe the “service_type”, and thereby obtains an effect of enabling to recognize that the composite channel is a channel of the 3D program.
In “loop” is described a loop of the service number included in the target transport stream. With “service_id”, it is determined to be “service_id” for the corresponding transport stream. “service_type” indicates the service type of the target service. Other than those shown in
As was explained in the above, the service list descriptor can be used as the information of the transport streams included in the target network.
In this manner, by executing the process for each field of that descriptor therein, the receiving apparatus 4 can observe “service_type”, and thereby obtains an effect of enabling to recognize that the composite channel is the channel of the 3D program.
Next, explanation will be made about the detailed descriptors within each table. First of all, although it is possible to determine the type or format of ES, depending on the type of data within “stream_type” described in a 2nd loop of PMT, as was explained in
Also, other than “stream_type”, it is also possible to make the determination in the region, which is set “reserved” at present in PMT, while newly assigning thereto the 2D/3D identification bit for identifying to be the 3D program or the 2D program.
With EIT, similarly, it is also possible to make determination while assigning the 2D/3D identification bit, newly, into the region of “reserved”.
When determining the 3D program with using the component descriptor, which is disposed on PMT and/or EIT, as was explained in
As the determination method with using the component group descriptor disposed in EIT, as was explained in
As the determination method with using the 3D program details descriptor disposed in PMT and/or EIT, as was explained in
In case where assigning the 3D video service to “0x01”, as was explained in
Also, about the program information, there is a method of obtaining it through a communication path for exclusive use thereof (broadcast signal or Internet). In that case, it is possible to determine the 3D program, in the similar manner, if there are the starting time of the program, CH (broadcast composite channel, URL or IP address), and the descriptor indicating on whether it is the 3D program or not.
In the explanation given in the above, the explanation was given about various information (the information included in the tables and the descriptors) for determining to be the 3D video or not, by the unit of service (CH) or program; however, all of those are not always necessary to be transmitted, in the present invention. It is enough to transmit necessary information, fitting to the broadcasting mode. Among such information, by confirming independent or single information, respectively, determination of being the 3D video, or not, can be made by the unit of service (CH) or program, or the determination can be made by the unit of service (CH) or program by combining plural numbers of information. In case when determining by combining the plural numbers of information, it is also possible to determine that, although it is the 3D video broadcast service, a part of the programs is the 2D video, etc. If such determination can be made, it is possible to display that the corresponding service is the “3D video broadcast service” on the receiving apparatus, for example, with EPG, and also, if the 2D video program(s) is/are mixed, other than the 3D video program(s), in that service, it is possible to exchange the display control, etc., between the 3D video program and the 2D video program, when receiving the programs.
However, in case where it is determined to be the 3D program, according to the determination method of the 3D program, which was explained in the above, if the 3D components designated in
Next, explanation will be given about the process when reproducing the 3D content (digital content including 3D video). Herein, first of all, explanation will be given on the reproducing process in case of 3D/2D aspect separated ES transmission method, wherein there are such a main aspect video ES and a sub-aspect video ES for one (1) ES, as shown in
When the present program is the 3D program, the system controller unit 51, first of all, instructs the tuning controller unit 59 to output the 3D program. The tuning controller unit 59, receiving the instruction mentioned above, first of all, obtains the PID (packet ID) and the coding method (for example, H.264/MVC, MPEG 2, H.264/AVC, etc.), with respect to each of the main aspect video ES and the sub-aspect video ES, from the program analyzer unit 54, and next, it executes control on the multiplex divider unit 29, so that it divides the main aspect video ES and the sub-aspect video ES from the multiplexing, and thereby outputs them to the video decoder unit 30.
Herein, the multiplex divider unit 29 is controlled so that the main aspect video ES is inputted into a 1st input of the video decoder unit and the sub-aspect video ES is inputted into a 2nd input thereof, respectively. Thereafter, the tuning controller unit 59 transmits information to the decoding controller unit 57, indicating that the 1st input of the video decoder unit 30 is for the main aspect video ES and the 2nd input thereof is for the sub-aspect video ES, and further instructs to decode those ESs therein.
For decoding the 3D program, in which the coding method differs between the main aspect video ES and the sub-aspect video ES, such as the combination example 2 and the combination example 4 of the 3D/2D aspect separated ES transmission method as shown in
For decoding the 3D program, in which the coding method is the same between the main aspect video ES and the sub-aspect video ES, such as the combination example 1 and the combination example 3 of the 3D/2D aspect separated ES transmission method as shown in
The decoding controller unit 57, receiving the instruction mentioned above, executes the decoding on the main aspect video ES and the sub-aspect video ES, corresponding to the coding methods thereof, and thereby it outputs the video signals for the left-side eye and the right-side eye to the video conversion processor unit 32. Herein, the system controller unit 51 instructs the video conversion controller unit 61 to execute the 3D output process. The video conversion controller unit 61, receiving the above instruction from the system controller unit 51, controls the video conversion processor unit 32, so as to output them from the video output 41, or display the 3D picture on the display that the receiving apparatus 4 has.
Explanation will be given about that 3D reproduction/output/display method, by referring to
When applying the method shown in
Also, when displaying the video signals mentioned above on the display 47 equipped with the receiving apparatus 4, with applying the method shown in
Further, in any of the system structures or configuration shown in
Explanation will be made on the operation when executing 2D output/display of the 3D content of the 3D/2D aspect separated ES transmission method. When the user instructs exchange to the 2D video (for example, pushing down the “2D” key on the remote controller), then the user instruction receiver unit 52 receiving the key code mentioned above, instructs the system controller unit 51 to exchange the signal to the 2D video (however, in the processes hereinafter, the same processes are executed, even when switching is made to the 2D output/display under the condition other than the exchange instruction by the user to the 2D output/display of the 3D content of the 3D/2D aspect separated ES transmission method). Next, the system controller unit 51 instructs the tuning controller unit 59 to output the 2D video, at first.
The tuning controller unit 59, receiving the instruction mentioned above, first of all, obtains the ES for the 2D video (the above-mentioned main aspect ES or the ES having a default tag) from the program information analyzer unit 54, and controls the multiplex divider unit 29 so as to output the above-mentioned ES to the video decoder unit 30. Thereafter, the tuning controller unit 59 instructs the decoding controller unit 57 to decode that ES. Thus, in the 3D/2D aspect separated ES transmission method, since the stream or the ES differs between the main aspect and the sub-aspect, it is sufficient to decode only the main aspect stream or ES.
The decoding controller unit 57, receiving the instruction mentioned above, controls the video decoder unit 30, so as to decode the ES mentioned above, thereby to output the video signal to the video conversion processor unit 32. Herein, the system controller unit 51 controls the video conversion controller unit 61 so as to output the 2D video therefrom. The video conversion controller unit 61, receiving the instruction mentioned above, controls the video conversion processor unit 32 so as to output the 2D video signal from the video output terminal 41, or to display the 2D picture on the display.
Explanation will be given about that 2D output/display method, by referring to
Herein, although description was made about the method of not decoding the ES for the right-side eye, as the 2D output/display method; however, while decoding both the ES for the left-side eye and the ES for the right-side eye, the 2D display may be achieved by executing thinning upon the video signal for the right-side eye in the video conversion processor unit 32. In that case, there is no necessity of a process for exchanging the decoding process and the multiplex dividing process, and therefore there can be expected effects of reduction of exchanging time and simplification of software processing, etc.
Next, explanation will be given on the reproducing process of the 3D content in case where the video for the left-side eye and the video for the right-side eye are in one (1) video ES (for example, the video for the left-side eye and the video for the right-side eye are stored in one (1) picture of the 2D video, such as, in the Side-by-Side method or the Top-and-Bottom method). Similarly to the above, when the user instructs to change to the 3D picture, then the user instruction receiver unit 52, receiving the key code mentioned above, instructs the system controller unit 51 to switch to the 3D picture (however, in the processes hereinafter, the same processes are executed even in the case where the switching to the 3D output/display is made under the condition other than that where the exchange instruction is made by the user to change the 3D content according to the Side-by-Side method or the Top-and-Bottom method to the 3D output/display). Next, similarly, the system controller unit 51 determines on whether the present program is the 3D program or not in accordance with the method mentioned above.
When the present program is the 3D program, the system controller unit 51 instructs the tuning controller unit 59, at first, to output the 3D video. The tuning controller unit 59, receiving the instruction mentioned above, firstly obtains the PID (packet ID) of the 3D video ES including the 3D video and the coding method thereof (for example, MPEG 2, or H.264/AVC, etc.) from the program analyzer unit 54, and next, controls the multiplex divider unit 29 so as to divide said 3D video ES from the multiplexing, thereby to output it to the video decoder unit 30, and also controls the video decoder unit 30 to execute the decoding process corresponding to the coding method and to output the decoded video signal to the video conversion processor unit 32.
Herein, the system controller unit 51 instructs the video conversion controller unit 61 to execute the 3D output process. The video conversion controller unit 61, receiving the instruction mentioned above, instructs the video conversion processor unit 32 to divide the inputted video signal into the video for the left-side eye and the video for the right-side eye, so as to treat the process, such as, scaling, etc. (the details thereof will be mentioned later) thereon. The video conversion processor unit 32 outputs the converted video signals from the video output 41, or displays the picture on the display that the receiving apparatus 4 has.
Explanation will be given about the reproduction/output/display method of that 3D video, by referring to
In
In the method shown in
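The division of one Side-by-Side frame into the pictures for the left-side eye and the right-side eye, with scaling back to the full width, can be sketched as follows. This is a minimal sketch, assuming the frame is a plain list of pixel rows and using nearest-neighbour horizontal scaling, purely for illustration; an actual video conversion processor would operate on hardware frame buffers with a higher-quality filter.

```python
def split_side_by_side(frame):
    """Divide a Side-by-Side frame into (left-eye, right-eye) full-width pictures."""
    width = len(frame[0])
    half = width // 2
    left, right = [], []
    for row in frame:
        # left half -> picture for the left-side eye, stretched to full width
        left.append([row[(x * half) // width] for x in range(width)])
        # right half -> picture for the right-side eye, stretched likewise
        right.append([row[half + (x * half) // width] for x in range(width)])
    return left, right
```

For the Top-and-Bottom method, the same idea applies with the rows divided vertically instead of the columns horizontally.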
Explanation will be given about the operation of each part when executing the 2D display of the 3D content according to the Side-by-Side method or the Top-and-Bottom method, hereinafter. When the user instructs to change to the 2D picture (for example, pushdown of the “2D” key on the remote controller), then the user instruction receiver unit 52 instructs the system controller unit 51 to exchange to the 2D video (however, in the process hereinafter, the same processes are executed, even when changing to the 2D output/display under the condition other than that where the user instructs to change to the 2D output/display of the 3D content according to the Side-by-Side method or the Top-and-Bottom method). The system controller unit 51, receiving the instruction mentioned above, instructs the video conversion controller unit 61 to output the 2D video. The video conversion controller unit 61, receiving the instruction mentioned above, controls the video conversion processor unit 32 to execute the 2D video output of the inputted video signal mentioned above.
Explanation will be given about the 2D output/display method of videos, by referring to
The video conversion processor unit 32 outputs the video signal, on which the above-mentioned process is treated, as the 2D video from the video output 41, and outputs the control signal from the control signal output unit 43.
However, examples of executing the 2D output/display, while keeping the 2 aspects of the 3D contents of the Side-by-Side method and the Top-and-Bottom method received within one (1) video or picture as they are, are also shown in
Explanation will be given on an example when the 2D video (the video not having depth information and/or parallax information) is converted into the 3D video.
Analysis is made on the 2D video for each picture, and comparison is made on stereoscopic determination elements (form of a body (size, shape), color difference, brightness, chroma, contrast, sharpness of the body, change of shading, position of the body (layout), or determining stereoscopic relationship by conducting filtering process, etc.), and thereafter, depth information (depth-map) is produced for each pixel or region. An example of the depth information is shown in
In this example, it is assumed that the depth is uniform for each object; however, the depth may change within an object, and therefore there can be considered a depth-map by a unit of pixel. In that case, since the depth information can be defined by the unit of pixel, it is possible to emphasize the 3D of the picture much more; however, there are cases where the amount or volume of calculation becomes large. Also, regarding a numerical value of the depth information, there may be an arrangement of assigning “0” to the forefront and a smaller value (a minus value) to the pixel or layer, which is determined to be located deeper than that, relatively.
Next, upon basis of the depth information mentioned above, a virtual stereoscopic vision of the picture is obtained (for example, the pixel is allocated at a position (x, y, z) on the 3D plane). An example of that is shown in
With processing the picture for the left-side eye and the picture for the right-side eye, which are produced in this manner, in the similar manner to the 3D output method of the 3D content mentioned above, display of the picture can be made in 3D.
Also, as other method, there is a method of determining the depth information for plural numbers of frames, by calculating the stereoscopic determining elements with using plural numbers of video frames, or by distinguishing between a dynamic object (i.e., an object having movement; for example, an object having a motion vector, a vector quantity of which is equal to or greater than a predetermined value) and a background or a static object (an object having no or less movement; for example, an object having a motion vector, a vector quantity of which is smaller than the predetermined value), and thereby calculating the depth information in such a manner that the dynamic object comes close to a front surface so as to be stereoscopic, etc.
Also, in other method, there is a method of treating one (1) frame of continuing frames having a movement (for example, a frame at time “t”) as the picture for the left-side eye, and treating the frame at other time (for example, a frame at time “t+a”) as the picture for the right-side eye. With such method, there is a merit that it can be done with less volume of calculation; however, there is a demerit that the converted 3D picture is hardly seen in 3D, except with a specific movement (for example, when the dynamic object moves horizontally on the static screen).
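The frame-delay method above can be sketched as follows; this is a minimal sketch, in which "frames" is any sequence of decoded frames and the function name and delay parameter are illustrative assumptions.

```python
def frames_to_stereo_pairs(frames, delay=1):
    """Pair the frame at time t (left-side eye) with that at t+delay (right-side eye)."""
    return [(frames[t], frames[t + delay])
            for t in range(len(frames) - delay)]
```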
Also, relating to a portion, which cannot be viewed in a specific frame (i.e., not photographed), there is a method of supplementing the video information from other frame(s), and thereby making up the pictures for both eyes.
With those 2D/3D conversion methods, it is possible to produce a picture that can be recognized to be 3D, easily, by the user, with a much higher accuracy, by combining plural numbers of the determining elements and/or the processing methods.
<Example of Video Display Processing Flow Fitting to User Condition, when Program Changes>
Next, explanation will be given on the output/display process when the broadcasting method (the 3D program and the transmission method thereof, or the 2D program) of the program, which is under viewing/listening at present, is changed. When the broadcasting method of the program, which is under viewing/listening at present, is changed, if the method for processing the video is not changed within the receiving apparatus, in particular, there is a possibility that a normal video display cannot be performed, thereby losing the convenience for the user. Contrary to this, by executing the processes, which will be shown below, it is possible to improve or increase the convenience for the user.
The system controller unit 51 obtains the program information of the present program from the program analyzer unit 54, thereby determines on whether the present program is the 3D program or not, according to the determining method of the 3D program, and further obtains the 3D method type of the present program (such as, the 2 aspects separated ES transmission method/the Side-by-Side method, etc., determined from the 3D method type described in the 3D program details descriptor), at the same time (S201). However, the program information of the present program may be obtained, not limited to the time when the program changes, but periodically. Obtaining the program information periodically is effective in the case where the 3D video and the 2D video are mixed up within the same program.
As a result of determination, if it is the 3D program (“yes” of S202), then next, confirmation is made on a 3D view preparation condition of a user (S204).
The 3D view preparation condition means a condition where the user makes preparation for viewing/listening the 3D video or picture. For example, after pushing down the “3D” button on the remote controller, and in particular, like a case when the user selects “see 3D” on an exchange display of 3D/2D, such as, shown in the menu of
Also, determination of the 3D view preparation condition of the user, other than that, may be made by a user wearing completion signal, generated by the 3D view support device, or, while photographing the viewing/listening condition of the user by a photographing device or apparatus, by executing an image recognition or a face recognition of the user from the result of photographing, so that the determination may be made that she/he wears or puts on the 3D view support device.
Also, as the operation for determining the 3D view preparation condition to be “NG”, for example, when the fact that the user presents an intention of not viewing/listening the 3D program, through her/his action, for example, the user takes off the 3D view support device, or pushes down the “2D” button on the remote controller, is transmitted to the receiving apparatus, through the user operation input unit 45, then the system controller unit 51 sets the 3D view preparation condition to “NG”, and holds that condition.
When the 3D view preparation condition of the user is "OK" ("yes" of S205), the 3D content is outputted in 3D, in the format corresponding to the respective 3D method type, according to the method mentioned above (S206).
Also, when the 3D view preparation condition of the user is not "OK" ("no" of S205), the system controller unit 51 controls so as to display one aspect (for example, the main aspect) of the 3D video signal in 2D, in the format corresponding to the respective 3D method type, according to the method explained in
As a result of the determination of step S202, if the present program is not 3D ("no" of S202), confirmation of the 3D view preparation condition of the user (S208) and determination thereof (S209) are executed similarly to the above. As a result of the determination, if the 3D view preparation condition of the user is "OK" ("yes" of S209), the 2D/3D conversion is executed on the video according to the method mentioned above, thereby displaying the video in 3D (S210).
Herein, there can be considered a case where a mark indicating that the 2D/3D conversion is being executed (a 2D/3D conversion mark) is displayed when executing the 2D/3D conversion and outputting the video. In this case, the user can distinguish between the 3D provided by the broadcast and the 3D produced by the apparatus, and as a result the user can also decide to stop the 3D viewing/listening.
Also, herein, in case the apparatus has no 2D/3D converting function, the 2D video may be controlled to be outputted in 2D as it is, without executing the 2D/3D conversion in step S210.
Also, when the 3D view preparation condition of the user is not "OK" ("no" of S209), the system controller unit 51 controls the 2D broadcast signal to be outputted in 2D as it is (S203).
In this manner, determination is made of the broadcasting method of the present broadcast (a 3D program and the transmission method thereof, or a 2D program) and of the 3D view preparation condition of the user, and thereby it is possible to output the video automatically in the format suitable to them.
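The determination flow described above (S201 through S210) can be sketched as follows. This is a minimal illustrative sketch in Python; the function name, argument names, and returned labels are hypothetical, not part of the specification:

```python
# Hypothetical sketch of the output-selection flow (S201-S210).
# All names below are illustrative, not from the specification.

def select_output(program_is_3d: bool,
                  user_3d_ready: bool,
                  has_2d3d_converter: bool) -> str:
    """Return the output mode for the present program."""
    if program_is_3d:                      # S202
        if user_3d_ready:                  # S205
            return "output 3D as 3D"       # S206
        return "output one aspect in 2D"   # S207
    if user_3d_ready:                      # S209
        if has_2d3d_converter:
            return "2D/3D-convert and output 3D"  # S210
        return "output 2D as is"           # no converter: output as-is
    return "output 2D as is"               # S203
```

For example, a 2D program with the user ready for 3D is 2D/3D converted only when the apparatus has the converting function, matching the fallback described for step S210.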
Herein, as the method for determining the 3D program, by making the determination of whether the program is a 3D program or not, or the determination of the 3D method type, with using the descriptor stored in the user data region or the additional information region, which is encoded together with the video, it is possible to control the conversion mentioned above by a unit of frame, thereby improving the convenience of the user.
In case the user pushes the "OK" button on the remote controller, for example, while the message 1601 is displayed, the user instruction receiver unit 52 notifies the system controller unit 51 that the "OK" button has been pushed.
As an example of the method for determining the user selection on the screen display shown in FIG. 38, when the user operates the remote controller and pushes the "3D" button, or when she/he adjusts a cursor to the "OK/3D" button on the screen and pushes the "OK" button, determination is made that the 3D view preparation condition is "OK".
Or, when the user pushes a "Cancel" button or a "Return" button on the remote controller, or when she/he adjusts the cursor to "Cancel" on the screen and pushes "OK" on the remote controller, the 3D view preparation condition is determined to be "NG". Other than this, when such an operation as brings the 3D view preparation condition mentioned above into "OK" is done, the 3D view preparation condition is changed to "OK".
After the user makes the selection mentioned above, the flow shown in FIG. 46 is executed again in the system controller unit 51.
With this, even when the 3D program is displayed in 2D under the condition where the 3D view preparation condition of the user is "NG", it is possible to inform the user that the 3D program has started, and also to notify the apparatus easily that the 3D view preparation condition is "OK". Upon those results, the user can know the starting of the 3D program and can change to the 3D video or picture easily, thereby enabling to provide a viewing/listening method fitting the convenience of the user.
However, in the example of display shown in
Further, as another example of the message display displayed in step S207, not only displaying "OK" simply, as shown in
With doing so, compared to the display of "OK" as shown in
The message display to each user, which is explained in the present embodiment, is preferably deleted after the operation made by the user. In such a case, there can be obtained a merit that the picture can be viewed easily. Also, when a predetermined time-period has passed, similarly, it can be considered that the user has already recognized the information of the message; the message is then deleted, thereby bringing the picture into a condition to be seen easily, and thereby increasing the convenience for the user.
Further, even in case the present program is changed after conducting the tuning operation, the flow mentioned above is executed within the system controller unit 51.
Herein, explanation will be given of a method for a case of displaying a picture having no depth (being 2D), in spite of the fact that the broadcast signal is of the 3D transmission method, in a part or the entirety of a 3D program. Under such a situation, i.e., when the user enjoys viewing/listening with the expectation of a 3D program, there occur cases where the user receives an uncomfortable feeling or displeasure if a plane picture having no depth is outputted suddenly. Also, in case a much better 3D video can be obtained through the 2D/3D conversion within the apparatus than the 3D video included in the original content, it is possible to increase the convenience for the user by outputting the video obtained through the 2D/3D conversion of the apparatus.
First of all, explanation will be given of the method for determining the depth of the picture of a 3D program. The picture having less depth can be considered the picture having less difference between the pictures of the separate aspects (hereinafter, "separate aspect picture(s)") for the left-side eye and the right-side eye, respectively. Then, as an example, there is a method of calculating the difference of numerical values, such as of R, G and B, or Y, U and V, for each pixel at the same position of the picture display on the separate aspect pictures, comparing the total sum of those differences with a predetermined value, and determining the picture to have no depth when the total sum is lower than the predetermined value.
In more detail, in the case of a picture whose 3D transmission method is "Side-by-Side", where the size in the horizontal direction of the entire picture is "X" (thus, the size in the horizontal direction of the picture of each aspect is "X/2"), and the size in the vertical direction thereof is "Y", the difference can be calculated by the following equation (1), comparing the difference of the separate aspect pictures by the Y, U and V components:

Σ[x=0 to X/2-1] Σ[y=0 to Y-1] { |Y(x, y) - Y(x + X/2, y)| + |U(x, y) - U(x + X/2, y)| + |V(x, y) - V(x + X/2, y)| } ≤ D . . . (1)

Here, the left-hand side represents the total sum of the difference values of the Y, U and V components of the picture, and the right-hand side is a constant value (herein, D). Also, Y(x,y) indicates the value of the Y component of the picture at the (x,y) coordinates thereof, and U(x,y) and V(x,y) are similar thereto.
Herein, with calculation while setting the constant value D to "0", determination can be made that it is the picture having no depth only if the pictures of the two aspects coincide completely (namely, the condition that there is completely no depth information).
As the determining method, other than the example of the difference of each pixel, there are methods such as comparing the histogram of each component of both pictures (for example, Y, U and V, or R, G and B), or comparing the difference of the results of applying a specific digital filter (for example, a high-pass filter) to both pictures, etc.
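The per-pixel determination of the equation (1) can be sketched as follows for a Side-by-Side frame. The frame format (rows of (Y, U, V) tuples) and the function name are assumptions made here for illustration only:

```python
# Illustrative sketch of the depth determination of equation (1) on a
# Side-by-Side frame. The frame format (rows of (Y, U, V) tuples) and
# the function name are assumptions, not from the specification.

def has_no_depth(frame, threshold_d=0):
    """Return True when the summed per-pixel YUV difference between the
    left-eye half and the right-eye half is at most the constant D."""
    width = len(frame[0])            # horizontal size "X" of the entire picture
    half = width // 2                # each aspect picture is "X/2" wide
    total = 0
    for row in frame:
        for x in range(half):
            yl, ul, vl = row[x]            # left-side-eye pixel
            yr, ur, vr = row[x + half]     # right-side-eye pixel at the same position
            total += abs(yl - yr) + abs(ul - ur) + abs(vl - vr)
    return total <= threshold_d
```

With threshold_d set to 0, only frames whose two aspect pictures coincide completely are determined to have no depth, as noted above.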
Explanation will be given of a processing flow of the system controller unit 51 applying those depth determinations, by referring to
Next, determination is made of whether the process of converting from the 2D video to the 3D video is necessary or not (2D/3D conversion necessity determination) (S903). As the determining method, the result of the depth determination mentioned above is applied, for example. Thus, the 2D/3D conversion is determined to be necessary when the pixel difference of the picture is equal to or less than the predetermined value (i.e., the equation (1) is true), while the 2D/3D conversion is determined to be unnecessary when the pixel difference of the picture is greater than the predetermined value (i.e., the equation (1) is false). When it is not determined that the 2D/3D conversion is necessary ("no" of S903), no process is executed in particular.
On the other hand, when it is determined that the 2D/3D conversion is necessary ("yes" of S903), the 3D video is converted into 2D (S904). As the method of conversion, for example, the 2D video is outputted according to the method described for displaying the 3D video mentioned above in 2D. Next, on the converted 2D video mentioned above, the 2D/3D conversion is executed according to the method mentioned above (S905).
As was mentioned above, in the case of a 3D picture having no sense of depth, for example, it is possible to obtain the sense of depth by executing the 2D/3D conversion of the video on the side of the apparatus.
Although the explanation was made on the example of making the necessity determination by analyzing the picture, in the necessity determination of the 2D/3D conversion, the process mentioned above may alternatively be executed after determining the 2D/3D conversion with using a flag included in the signal (for example, a 2D/3D conversion flag). With this, the transmitting side can notify the receiving side, with using the flag, that the picture is one on which the 2D/3D conversion may be executed or should be executed, thereby enabling to control the necessity/unnecessity of the execution of the 2D/3D conversion in the receiving apparatus.
Also, by executing the control with using the flag mentioned above on the receiving apparatus side, it is possible to provide a picture that can be considered appropriate for the 2D/3D conversion after executing the conversion thereon. Also, the processes such as the depth determining process in the example mentioned above are unnecessary, thereby bringing about a merit that the processing load in the apparatus can be lightened or reduced.
As a position where the 2D/3D conversion flag should be inserted, there can be considered a method of describing it at a position similar to the position where the information is described in the example of the determining method of the 3D program mentioned above. In case of describing it in the program information, since the frequency of renewal is low, there can be obtained a feature that the processing load for confirming the flag is reduced within the apparatus. If inserting it within a header of the video signal, although there is a possibility of increasing the processing load for confirming the flag, it is possible to confirm the flag by a unit of the video stream, and there is a case that the quality of the picture to be provided can be improved by switching the flag by the unit of frame, for example.
In case the flag mentioned above is not included in the signal, it may be treated as "the 2D/3D conversion is inhibited", or on the contrary as "the 2D/3D conversion is permitted", for example.
Or, as another method for determining the necessity of the 2D/3D conversion, there can be considered a method of determining it depending on the setup made by the user. For example, with using the screen of the user setup as shown in
Also, with a method other than those mentioned above, the user setup may be switched by pushing a button on the remote controller (for example, a "3D on apparatus/3D on broadcast switching button"). In this manner, if the user determines the necessity of the 2D/3D conversion by her/himself, it is possible to display the preferable one, between the 3D picture which is already given to the video intentionally, and the picture 2D/3D converted on the apparatus.
Also, as a further other method for determining the necessity of the 2D/3D conversion, for example, in case there is no video information of one aspect within the streams transmitted by the 3D 2-aspects separated ES transmission method, etc. (for example, the stream of the sub-aspect (for the right-side eye) is not transmitted), it is preferable to determine that the 2D/3D conversion is necessary. With making such a determination, it is possible to output the video converted into 3D within the apparatus automatically, in a case where there is only the picture of the aspect of one side (for example, where the ES of one side is not transmitted with the 3D 2-aspects separated ES transmission method). In this case, in step S904 shown in
The necessity determinations of those 2D/3D conversions may be made combining the respective conditions. For example, even in the case where the 2D/3D conversion flag is "not need conversion", if the determination of the picture is "need conversion" and the selection made by the user is "need conversion", the 2D/3D conversion is executed, etc.; i.e., it is possible to execute the 2D/3D conversion fitting the preference of the user much more, by determining it depending on the respective priorities and/or combinations.
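One possible combination of the three necessity determinations (the 2D/3D conversion flag, the picture analysis, and the user setup) can be sketched as follows. The priority order chosen here is only one example, since the specification leaves the respective priorities and combinations open:

```python
# Hypothetical sketch of combining the three necessity determinations
# by priority. Each argument is True ("need conversion"), False ("not
# need conversion"), or None (no determination available). The priority
# order is an assumption for illustration.

def conversion_needed(flag, picture, user):
    """Return True when the 2D/3D conversion should be executed."""
    if picture is True and user is True:
        return True                      # picture and user together override the flag
    if flag is not None:
        return flag                      # otherwise follow the signal flag
    if user is not None:
        return user                      # then the user setup
    return bool(picture)                 # finally the picture analysis
```

The first branch reproduces the example in the text: even when the flag says "not need conversion", the conversion is executed if both the picture determination and the user selection say "need conversion".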
Having made a convert recording of the video which is 2D/3D converted as mentioned above, it is not necessary to execute the similar process when reproducing; therefore, the processing load when reproducing is lightened or reduced, and further the delay on display is lessened. Or, by outputting the content, which is 2D/3D converted and of which the convert recording is made, to the outside (for example, a high-speed digital I/F output, or a network output), there can be obtained a merit that the 2D/3D converted video can be viewed/listened to or enjoyed on an external equipment having no 2D/3D converting function.
When executing the convert recording accompanying the 2D/3D conversion, it is preferable to change each descriptor or flag, etc., into content that shows "3D", in particular the descriptors or flags applied in the method for determining the 3D program mentioned above, within the video encoder unit 35 or the multiplex divider unit 37, etc. Also, it is preferable to describe the converted 3D method type, fitting to the above-mentioned converted content, so that the above-mentioned method for determining the 3D program can be applied thereto, too.
Also, when executing the convert recording, in case the information described in the program information (for example, EIT) shows 3D while the information described in the stream (for example, the user data area of MPEG) shows 2D, etc., it is preferable to execute the 2D/3D conversion automatically. This is because there can be assumed a case where the video is changed from 3D to 2D in the middle of the program; in such a case, the video of the entire program is changed to 3D by executing the 2D/3D conversion, and thereby it is possible to increase the convenience for the user when viewing/listening to the reproduction.
About the setting up of the recording format for executing the TS recording or the convert recording, there can be considered a method of selecting the recording format depending on a selection by the user, while setting up the selection content in advance by the user. The following operations are possible: executing the TS recording even if the picture under viewing/listening is the 2D/3D converted video, or, on the contrary, executing the convert recording accompanying the 2D/3D conversion on the recording side without executing the 2D/3D conversion on the video to be viewed, etc.; and thereby it is possible to increase the convenience for the user.
An example of a setup screen for the setup mentioned above is shown in
Next, explanation will be given of an output/display process of content when the next program is a 3D content. Relating to the viewing/listening of a 3D content program in a case where the next program is a 3D content, if the display of the 3D content of said next program is started in spite of the fact that the user is not under the condition for viewing/listening to the 3D content, then the user cannot view/listen to that content under the best condition, and therefore there is a possibility of losing the convenience of the user. On the contrary, with the execution of the following processes, it is possible to increase the convenience for the user.
In
When the next program is not a 3D program ("no" of S102), the process is ended without executing any process in particular. When the next program is a 3D program ("yes" of S102), the time up to the starting of the next program is calculated. In more detail, the starting time of the next program or the ending time of the present program is obtained from the EIT of the program information mentioned above, the present time is obtained from the time management unit 55, and thereby the difference thereof is calculated.
When it is not equal to or less than X min. until the starting of the next program ("no" of S103), the system controller unit waits, without executing any process in particular, until the time X min. before the starting of the next program. When it is equal to or less than X min. until the starting of the next program ("yes" of S103), a message indicating that the 3D program will start soon is displayed to the user (S104).
About the determination time X until the starting of the program, if making X small, there is brought about a possibility that the 3D view preparation by the user is not in time. Also, when making X large, there can be considered demerits, such as that the display of the message for a long time becomes an obstacle to the viewing/listening, and that an interval is made after the completion of the preparation; therefore, it is necessary to adjust it to an appropriate time-period.
Also, when displaying the message to the user, the starting time of the next program may be displayed in detail. An example of the display on the screen in that case is shown in
However, although the example of displaying the time-period until the 3D program is started is shown in
Also, as is shown in
Next, explanation will be given of the method for determining the condition of whether the 3D view preparation is completed or not, and thereby changing the video to the 2D display or the 3D display, after notifying the user that the next program is 3D.
The method for notifying the user that the next program is 3D is as was mentioned above. However, this differs from that mentioned above in the aspect that, in particular, the message to be displayed to the user in step S104 is an object to be responded to by the user (hereinafter, a user response receiver object: for example, a button on the OSD). An example of this message is shown in
A reference numeral 1001 depicts the message as a whole, and 1002 depicts a button for the user to make a response, respectively. When displaying the message 1001 shown in
The system controller unit 51, receiving the information mentioned above, preserves the fact that the 3D view preparation condition of the user is "OK" as the condition. Next, after time passes and the present program becomes the 3D program, the process flow in the system controller unit 51 is the same as the video display process fitting the user condition when the program changes, as was explained above.
Also, in the example mentioned above, it can be considered that the process is executed by determining only the program information of the next program, which was obtained previously. In this case, there can also be considered a method of using the program information obtained previously (for example, in step S101 shown in
The message display to each user, as was explained in the present embodiment, is preferably deleted after the operation by the user. In such a case, there can be obtained a merit that the user is able to view/listen to the picture easily after she/he makes the operation. Also, after a predetermined time-period passes, similarly, by considering that the user has already noticed the information of the message, the message is deleted and thereby the condition is brought about in which the picture can be viewed easily; this increases the convenience of the user.
With the embodiment explained above, in a scene where the 3D program and the 2D program are exchanged, etc., it is possible to execute the most suitable exchange control judging from the condition of the user and the condition of the broadcast program, and also, with the picture displayed on that occasion, it is possible to provide the most suitable 3D picture to the user, by executing the 2D/3D conversion judged from the characteristic of the picture, the condition of the broadcast signal, and the setup values made by the user.
Also, by bringing the converted video mentioned above into being recorded, there can be expected the following effects: i.e., enabling reduction of the load and/or the delay when reproducing/displaying, and the most suitable display of the picture at the point of exchange of the picture also upon reproduction by equipment having no such 2D/3D converting function, etc.
In the explanation given above, the explanation was given on the example of transmitting the 3D program details descriptor, which was explained in
As the information to be stored, there can be listed the "3d_2d_type" (3D/2D type) information, which is explained in
In more detail, if the picture coding method is the MPEG2 method, the coding may be done on the user data area or region following the "Picture header" and the "Picture Coding Extension", including the 3D/2D type information and the 3D method type information therein.
Also, if the picture coding method is the H.264/AVC method, the coding may be done on the additional information (supplemental enhancement information) included in an access unit, including the 3D/2D type information and the 3D method type information therein.
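As one concrete illustration of reading such information on the receiving side, the following sketch locates the MPEG-2 user-data start code (0x000001B2) in a video elementary stream and reads two bytes as the 3D/2D type and the 3D method type. The two-byte payload layout is an assumption made here for illustration only; the specification fixes only the region in which the information is coded:

```python
# Illustrative sketch: locating a user-data section in an MPEG-2 video
# elementary stream (start code 0x000001B2) and reading two hypothetical
# bytes carrying the 3D/2D type and the 3D method type. The payload
# layout is an assumption for illustration.

USER_DATA_START_CODE = b"\x00\x00\x01\xb2"

def find_3d_info(es: bytes):
    """Return (3d_2d_type, 3d_method_type) from the first user-data
    section, or None when no such section is present."""
    pos = es.find(USER_DATA_START_CODE)
    if pos < 0 or pos + 6 > len(es):
        return None                      # no user data: treat as 2D, as described below
    payload = es[pos + 4:pos + 6]        # two bytes after the start code (assumed layout)
    return payload[0], payload[1]
```

Returning None when no user data is found matches the receiver construction described below, in which a picture carrying no 3D relation information is determined to be a 2D picture.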
In this manner, transmitting the information indicating the type of the 3D picture or the 2D picture, and/or the information indicating the type of the 3D method, on the coding layer of the picture within the ES brings about an effect that the picture can be identified by a unit of frame (or picture).
In this case, since the identification or discrimination mentioned above can be made by a unit shorter than that when storing it into the PMT (Program Map Table), it is possible to improve the response speed of the receiver in response to the exchange between the 3D video and the 2D video in the transmitted picture, and therefore noises, which may possibly be generated at the time of exchanging between the 3D video and the 2D video, can be suppressed much more.
Also, in the case where the 3D program details descriptor mentioned above is not disposed on the PMT (Program Map Table), but the information mentioned above is stored on the video/picture coding layer, which is encoded together with the picture when encoding the picture, in particular when the 2D/3D mixture broadcast is started newly at a conventional broadcast station, for example, it is sufficient for the broadcast station side to renew only the encoder unit 12 in the transmitting apparatus 1 shown in
However, if the 3D relation information, such as the "3d_2d_type" (3D/2D type) information and/or the "3d_method_type" (3D method type) information, etc. (in particular, the information for identifying 3D/2D), is not stored in the predetermined region(s) or area(s), such as the user data region and/or the additional information region, which is/are encoded together with the picture when encoding the picture, then the receiver may be constructed so that it determines such video to be a 2D picture. In this case, the broadcast station can omit the storage of that information when processing the encoding of a 2D picture, and it is possible to reduce the number of processing steps in broadcasting.
In the explanation given above, as the example of disposing or arranging the identification information for identifying the 3D video by a unit of the program (event) or a unit of the service, the explanation was made on the example of including it in the program information, such as the component descriptor, the component group descriptor, the service descriptor and the service list descriptor, etc., or the example of providing the 3D program details descriptor newly. Also, it was mentioned that those descriptors are included in the table(s), such as PMT, EIT [basic/schedule extended/present/following], NIT, SDT, etc., and transmitted.
Herein, as a further example, explanation will be made of an example of disposing or arranging the identification information of the 3D program (event) within the content descriptor (Content descriptor) shown in
The structure of the content descriptor is as follows. "descriptor_tag" is a field of 8 bits for identifying the descriptor itself; in this descriptor is described the value "0x54", by which the content descriptor can be distinguished. "descriptor_length" is a field of 8 bits, describing the size of this descriptor therein.
"content_nibble_level_1" (genre 1) is a field of 4 bits, and represents the first-stage classification of the content identification. In more detail, there is described a large or rough classification of the program genre. When representing the program characteristic, "0xE" is designated.
"content_nibble_level_2" (genre 2) is a field of 4 bits, and represents the second-stage classification of the content identification in more detail than "content_nibble_level_1" (genre 1). In more detail, there is described a middle-level classification of the program genre. If "content_nibble_level_1" = "0xE", the type on the program characteristic code table is described therein.
"user_nibble" (user genre) is a field of 4 bits, and describes the program characteristic therein only when "content_nibble_level_1" = "0xE". In cases other than that, it is assumed to be "0xFF" (no definition). As is shown in
The receiver receiving that content descriptor determines that the said descriptor is the content descriptor if "descriptor_tag" is "0x54". Also, it is possible to determine the end of the data described in the present descriptor by means of "descriptor_length". Further, processing is executed with determining the description within the length indicated by "descriptor_length" to be effective, while neglecting the description exceeding that.
Also, the receiver determines whether the value of "content_nibble_level_1" is "0xE" or not, and determines it as the large classification of the program genre when it is not "0xE". When it is "0xE", it is not determined as the genre (category); rather, it is determined that some program characteristic is designated in the following "user_nibble".
The receiver determines that "content_nibble_level_2" is the middle classification of the program genre (category) when the value of "content_nibble_level_1" is not "0xE", to be used together with the large classification of the program genre (category) in searching or displaying, etc. When the above-mentioned "content_nibble_level_1" is "0xE", determination is made that it indicates a type on the program characteristic code table, which is defined by the combination of the first "user_nibble" field and the second "user_nibble" field.
The receiver determines that, when the above-mentioned "content_nibble_level_1" is "0xE", the first "user_nibble" field and the second "user_nibble" field are fields indicating the program characteristic in combination. When the above-mentioned "content_nibble_level_1" is not "0xE", the first "user_nibble" field and the second "user_nibble" field are neglected, whatever values are inserted therein.
Therefore, when the value of "content_nibble_level_1" of the descriptor is not "0xE", the broadcast station is able to transmit the genre (category) information of the target event (program) to the receiver, by the combination of the values of "content_nibble_level_1" and "content_nibble_level_2".
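The receiver-side handling of the content descriptor described above can be sketched as follows. The per-entry byte layout (one byte holding the two content nibbles, one byte holding the two user nibbles) follows the field descriptions above, and the helper name and returned tuples are illustrative:

```python
# Hedged sketch of parsing content-descriptor entries. The entry layout
# (tag byte, length byte, then two-byte entries) follows the field
# descriptions in the text; the function name and returned tuples are
# assumptions for illustration.

def parse_content_descriptor(data: bytes):
    """Return a list of ('genre', level1, level2) or
    ('characteristic', user1, user2) entries."""
    if data[0] != 0x54:                    # descriptor_tag of the content descriptor
        raise ValueError("not a content descriptor")
    length = data[1]                       # descriptor_length
    body = data[2:2 + length]              # description exceeding the length is neglected
    entries = []
    for i in range(0, len(body) - 1, 2):
        level1 = body[i] >> 4              # content_nibble_level_1
        level2 = body[i] & 0x0F            # content_nibble_level_2
        user1 = body[i + 1] >> 4           # first user_nibble
        user2 = body[i + 1] & 0x0F         # second user_nibble
        if level1 == 0xE:                  # program characteristic, not genre
            entries.append(("characteristic", user1, user2))
        else:                              # large/middle genre classification
            entries.append(("genre", level1, level2))
    return entries
```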
Herein, explanation will be given about an example, as shown in
In this case, the receiver is able to determine the large classification of the program genre (category) to be "news/press" or "sports", depending on the value of "content_nibble_level_1", and further to determine the middle classification of the program genre (category), which is at a level lower than the large classification of the program genre (category) such as "news/press" or "sports", by the combination of the value of "content_nibble_level_1" and the value of "content_nibble_level_2".
However, for achieving such a determining process as mentioned above, it is enough to memorize genre code table information, indicating the corresponding relationship of definitions between the combination of the value of "content_nibble_level_1" and the value of "content_nibble_level_2" and the program genre (category), within a memory unit which the receiver has.
Herein, explanation will be made about a case of transmitting the program characteristic information of the 3D program relation of the target event (program) with using that content descriptor. Hereinafter, explanation will be made about the case where the identification information of the 3D program is transmitted, not as the program genre (category), but as the program characteristic.
First of all, when transmitting the program characteristic information relating to the 3D program with using the content descriptor, the broadcast station transmits the content descriptor with setting "content_nibble_level_1" thereof to "0xE". With this, the receiver is able to determine that the information transmitted by that content descriptor is not the genre (category) information, but the program characteristic information of the target event (program). Also, with this, it is possible to determine that the first "user_nibble" field and the second "user_nibble" field, which are described in the content descriptor, indicate the program characteristic information by the combination thereof.
Herein, explanation will be given about an example, as shown in
In this case, the receiver is able to determine the program characteristics relating to the 3D programs of the target event (program) by the combination of the value of the first "user_nibble" field and the value of the second "user_nibble" field, and the receiver receiving the EIT which includes that descriptor therein is able to make a display of explanation, about the program which will be received in the future or which is received at present, that "no 3D picture is included", that it is a "3D picture program", or that "3D picture and 2D picture are included", or a display of graphics indicating those, on the display of the electronic program table (EPG).
Also, the receiver receiving the EIT including that content descriptor therein is able to search or pick up the programs including no 3D picture therein, the programs including the 3D picture therein, and the programs including the 3D picture and the 2D picture therein, etc., and thereby to make a list display of those programs, and so on.
Further, for achieving those determining processes, it is enough to memorize the program characteristic code table information in advance, indicating the corresponding relationship between the combination of the value of the first "user_nibble" field and the value of the second "user_nibble" field and the program characteristic, in the memory unit that the apparatus has.
Also, as another example of the definition of the program characteristic information relating to 3D programs, explanation will be given about a case where, for example, as shown in
In this case, the receiver is able to determine the program characteristics relating to 3D programs of the target event (program) from the combination of the value of the first “user_nibble” bits and the value of the second “user_nibble” bits, and to determine not only whether 3D picture is included in the target event (program) but also, when 3D video is included, the 3D transmission method thereof. If the receiver memorizes into its memory unit the information of the 3D transmission methods it can operate with (3D reproducible), then by comparing that memorized information with the information of the 3D transmission method of the target event (program) determined from the content descriptor included in EIT, the receiver is able to display on the electronic program guide (EPG), about a program to be received in the future or being received at present, an explanation such as “3D picture is included”, “3D picture is included, and can be 3D reproduced by this receiver”, or “3D picture is included, but cannot be 3D reproduced by this receiver”, or a graphic indicating the same.
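The comparison described above amounts to checking the event's transmission method against the set memorized in the receiver's memory unit. A minimal sketch follows; the method names and the exact label strings are assumptions for illustration:

```python
# 3D transmission methods this receiver can reproduce, as memorized in its
# memory unit. The concrete method names here are illustrative assumptions.
SUPPORTED_3D_METHODS = {"Side-by-Side", "Top-and-Bottom"}

def epg_label(includes_3d, transmission_method):
    """Choose the explanation text to display on the EPG for an event,
    based on information decoded from the content descriptor in EIT."""
    if not includes_3d:
        return "no 3D picture is included"
    if transmission_method in SUPPORTED_3D_METHODS:
        return "3D picture is included, and can be 3D reproduced by this receiver"
    return "3D picture is included, but cannot be 3D reproduced by this receiver"
```

A graphics display would branch on the same comparison; only the rendered output differs.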
Also, in the example mentioned above, the program characteristic is defined as “3D picture is included in the target event (program), and the 3D transmission method is the 3D 2-aspects separated ES transmission method” when the value of the first “user_nibble” bits is “0x3” and the value of the second “user_nibble” bits is “0x3”; however, a value of the second “user_nibble” bits may instead be prepared for each of the detailed combinations of the “3D 2-aspects separated ES transmission method” shown in
Also, the information of the 3D transmission method of the target event (program) may be displayed.
Also, a receiver receiving EIT including this content descriptor is able to search for programs that include no 3D picture, programs that include 3D picture and are 3D reproducible, programs that include 3D picture but are not 3D reproducible, and so on, and thereby to display a list of those programs.
Moreover, a program search can be made for each 3D transmission method with respect to the programs including 3D picture, and a list of programs can likewise be displayed for each 3D transmission method. Further, the search for programs that include 3D picture but cannot be 3D reproduced by the present receiver, and/or the program search by 3D transmission method, is effective even when those programs cannot be 3D reproduced by the present receiving apparatus, if they can be reproduced by another video program reproducing apparatus owned by the user. This is because, even for a program including 3D picture that cannot be 3D reproduced by the present receiver, it is possible to output that program, keeping its transport stream format as it is, from the video output portion of the present receiver to another piece of 3D video program reproducing equipment, which can then reproduce in 3D the program of the received transport stream format; and, if the present receiver has a recording unit for recording content onto a removable medium, it is also possible for the other 3D video program reproducing equipment mentioned above to reproduce in 3D the program recorded on that removable medium.
However, to achieve such a determination process as mentioned above, it is enough to memorize, in advance, into the memory unit that the receiver has, program characteristic code table information indicating the corresponding relationship between the combination of the value of the first “user_nibble” bits and the value of the second “user_nibble” bits and the definition of the program characteristic, together with the information of the 3D transmission methods compatible or operable with the receiver (3D reproducible).
The present invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein.
Number | Date | Country | Kind |
---|---|---|---|
2010-176925 | Aug 2010 | JP | national |