This application is based upon and claims the benefit of priority from Japanese Patent Application No. 2010-031822, filed Feb. 16, 2010; the entire contents of which are incorporated herein by reference.
Embodiments described herein relate generally to a playback apparatus, which plays back moving image data, and to a method of controlling a playback apparatus.
Recently, a notebook personal computer equipped with a main processor of low processing capability, a so-called Netbook, has been brought onto the market. The Netbook has a low processing capability, and in particular a limited capability for playing back a moving image such as a DVD video. In some cases, dropped frames are generated irregularly.
Jpn. Pat. Appln. KOKAI Publication No. 2002-108599 discloses the following technique. According to the technique, image data is transferred to a display device at or below a maximum frame rate preset in accordance with the response speed of the display device. In this way, moving image data is displayed within the range of the capability of the display device.
A general architecture that implements the various features of the embodiments will now be described with reference to the drawings. The drawings and the associated descriptions are provided to illustrate the embodiments and not to limit the scope of the invention.
Various embodiments will be described hereinafter with reference to the accompanying drawings.
In general, according to one embodiment, a playback apparatus includes a graphic unit, a central unit, and a control unit. The graphic unit is configured to generate a video signal to be output to a display device. The central unit is configured to decode coded moving image data and to execute a program in order to generate, based on the decoded moving image data, second moving image data to be input to the graphic unit. The control unit is configured to change a frame rate of the moving image data in accordance with a capability of the central unit.
First, the configuration of a playback apparatus according to one embodiment of the present invention will be described with reference to the drawings. The playback apparatus is realized as, for example, a notebook personal computer 10 including a computer main body 11 and a display unit 12.
The personal computer 10 is able to record and play back video content data (audio-visual content data) such as broadcast program data and video data input from an external device. In other words, the personal computer 10 has a television (TV) function for watching and recording broadcast program data broadcast by a television broadcasting signal. For example, the foregoing TV function is realized by a television (TV) application program previously installed in the personal computer 10. Moreover, the TV function has a function of recording video data input from an external audio-visual (AV) apparatus and a function of playing back recorded video data and recorded broadcast program data.
The display unit 12 is attached to the computer main body 11 so that it is freely rotatable between the following positions. One is an open position where the upper surface of the computer main body 11 is exposed. The other is a closed position where the display unit 12 covers the upper surface of the computer main body 11. The computer main body 11 has a thin box-shaped housing. The upper surface of the computer main body 11 is provided with a keyboard 13, a power button 14, an input control panel 15, a touchpad 16 and speakers 18A and 18B. The power button 14 turns on/off the power of the computer 10.
The input control panel 15 is an input device for inputting an event corresponding to a pressed button. The panel 15 is provided with a plurality of buttons for starting up each of a plurality of functions. The foregoing button group includes a control button for controlling the television (TV) function (watching, recording, and playing back recorded broadcast program data/video data).
The system configuration of the computer 10 will be described below with reference to the drawings.
As shown in the drawings, the computer 10 includes a CPU 101, a north bridge 102, a main memory 103, a south bridge 104, a graphics processing unit (GPU) 105, a video memory (VRAM) 105A, an audio controller 106, a BIOS-ROM 109, a hard disk drive (HDD) 111, a DVD drive 112, a wireless LAN controller 114, an IEEE 1394 controller 115, an embedded controller/keyboard controller IC (EC/KBC) 116, and the like.
The CPU 101 is a processor which controls the operation of the computer 10. The CPU 101 executes an operating system (OS) 201 and various application programs such as a DVD application program 202, which are loaded from the hard disk drive (HDD) 111 into the main memory 103. The DVD application program 202 is software for playing back a DVD loaded in the DVD drive 112, and is stored in a computer-readable medium. Further, the CPU 101 executes a Basic Input/Output System (BIOS) stored in the BIOS-ROM 109. The BIOS is a program for hardware control. For example, an Intel® Atom® processor is used as the foregoing CPU 101.
The north bridge 102 is a bridge device which makes a connection between a local bus of the CPU 101 and the south bridge 104. Further, the north bridge 102 has a built-in memory controller which controls access to the main memory 103. The north bridge 102 also has a function of communicating with the GPU 105 by means of a serial bus conforming to the PCI Express standard.
The GPU 105 is a display controller, which controls a liquid crystal display (LCD) 17 used as a display monitor of the computer 10. The GPU 105 uses the VRAM 105A as a work memory. A display signal generated by the GPU 105 is supplied to the LCD 17.
The south bridge 104 controls each device on a Low Pin Count (LPC) bus and each device on a Peripheral Component Interconnect (PCI) bus. Further, the south bridge 104 has a built-in Integrated Drive Electronics (IDE) controller for controlling the hard disk drive (HDD) 111 and the DVD drive 112. The south bridge 104 also has a function of communicating with the audio controller 106. The audio controller 106 is a sound source device, and outputs audio data to be played back to the speakers 18A and 18B.
The wireless LAN controller 114 is a wireless communication device which performs wireless communication conforming to, for example, the IEEE 802.11 standard. The IEEE 1394 controller 115 communicates with an external apparatus by means of a serial bus conforming to the IEEE 1394 standard.
The embedded controller/keyboard controller IC (EC/KBC) 116 is a one-chip microcomputer in which an embedded controller for power management and a keyboard controller for controlling the keyboard (KB) 13 and the touchpad 16 are integrated. The EC/KBC 116 has a function of turning on/off the power of the computer 10 in accordance with an operation of the power button 14 by the user.
DVD-VIDEO data played back by the DVD drive 112 is supplied to a DVD navigation 501. The DVD-VIDEO data is encrypted by a content scramble system (CSS). The navigation 501 decrypts the encrypted data and separates a video pack (V_PCK), a sub-picture pack (SP_PCK) and an audio pack (A_PCK) from the decrypted data. Then, the navigation 501 supplies the audio pack (A_PCK) to an audio decoder 511, supplies the video pack (V_PCK) to a video decoder 521, and supplies the sub-picture pack (SP_PCK) to a sub-picture decoder 541.
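The routing performed by the navigation 501 can be pictured with a minimal Python sketch. The pack representation and function name below are purely illustrative assumptions, not the actual interfaces of the navigation 501 or of the decoders:

```python
# Minimal sketch of the demultiplexing step described above.
# Pack types follow the text (V_PCK, SP_PCK, A_PCK); everything else is illustrative.

def route_packs(decrypted_packs, video_decoder, sub_picture_decoder, audio_decoder):
    """Send each pack of a CSS-decrypted stream to the matching decoder queue."""
    for pack in decrypted_packs:
        if pack["type"] == "V_PCK":        # video pack -> video decoder
            video_decoder.append(pack)
        elif pack["type"] == "SP_PCK":     # sub-picture pack -> sub-picture decoder
            sub_picture_decoder.append(pack)
        elif pack["type"] == "A_PCK":      # audio pack -> audio decoder
            audio_decoder.append(pack)

# Example usage with dummy packs standing in for a decrypted program stream.
video_q, sub_picture_q, audio_q = [], [], []
stream = [{"type": "V_PCK"}, {"type": "A_PCK"}, {"type": "SP_PCK"}]
route_packs(stream, video_q, sub_picture_q, audio_q)
```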
In this case, the DVD-VIDEO data is stored as 30-FPS interlaced moving image data. When a DVD is played back and a moving image is displayed on the LCD 17, a frame image is generated from neighboring field images, and thereby 60-FPS progressive moving images are displayed on the LCD 17.
The audio decoder 511 decompresses compressed and coded audio information to convert it into non-compressed audio data, and then supplies the audio data to an audio rate converter 512. The audio rate converter 512 converts the audio data to a suitable sampling rate, and thereafter supplies it to an audio renderer 513. The audio renderer 513 synthesizes the supplied audio data with audio data generated by other software operating on the computer, and then supplies the synthesized audio data to an audio driver 514. The audio driver 514 controls the audio controller 106 so that audio is output from the speakers 18A and 18B.
The video decoder 521 decodes a video pack (V_PCK) to generate field images. If line 21 data (closed caption data) is included, the video decoder 521 supplies the line 21 data to a line 21 decoder 522. The sub-picture decoder 541 decodes a sub-picture pack (SP_PCK). The decoded data is supplied to an expansion video renderer 523.
A mixer 523A of the expansion video renderer 523 executes interlaced/progressive conversion (I/P conversion) with respect to a plurality of field images supplied from the video decoder 521. In this way, the mixer 523A generates frame images from the field images. The generated frame images are supplied to a presenter 523B.
The presenter 523B executes the following processing: it synthesizes the sub-picture and a closed caption into a frame image and renders the frame.
Moving image data output from the presenter 523B is supplied to a display driver 524. The display driver 524 controls the GPU 105 to display a moving image on the LCD 17.
A player shell/user interface 531 executes processing related to display of a playback control panel. Moreover, the player shell/user interface 531 notifies a command corresponding to a button operated by the user to a Media Foundation 510 by way of a graphic manager/media foundation player 532. The Media Foundation 510 controls a topology formed of the navigation 501, the audio decoder 511 and the video decoder 521 in accordance with the notified command.
When the capability of the CPU 101 is low, the processing for decrypting the CSS takes time; for this reason, frames supplied to the presenter 523B are delayed. As a result, dropped frames are generated irregularly. This apparatus changes the frame rate of the moving image data output to the GPU 105 in accordance with the capability of the CPU 101 in order to prevent dropped frames from being generated irregularly. Hereinafter, the configuration for changing the frame rate of the moving image data output to the GPU 105 in accordance with the capability of the CPU 101 will be described.
The graphic manager/media foundation player 532 acquires the capability of the CPU 101, that is, the number of cores and the clock frequency, from the operating system 201 when the computer starts. Based on the acquired capability, the player 532 determines whether or not there is a possibility that dropped frames are generated irregularly. Then, in accordance with the determination result, the player 532 sets the frame rate of the moving image data output from the presenter 523B to the GPU 105 by way of the display driver 524.
The graphic manager/media foundation player 532 sets the I/P conversion method executed by the mixer 523A to either a first or a second conversion method in accordance with the capability of the CPU 101. More specifically, if the capability of the CPU 101 satisfies a preset condition and there is no possibility that dropped frames are generated irregularly, the player 532 instructs the Media Foundation 510 to execute I/P conversion using the first conversion method. Conversely, if the capability of the CPU 101 does not satisfy the preset condition and there is a possibility that dropped frames are generated irregularly, the player 532 instructs the Media Foundation 510 to execute I/P conversion using the second conversion method.
According to the foregoing first conversion method, the mixer 523A generates a frame image using neighboring field images. In this case, one frame image is generated for each field image, so that 60-FPS progressive moving image data is output.
Moreover, according to the foregoing second conversion method, the mixer 523A generates a frame image using field images belonging to the same frame. In this case, one frame image is generated from each pair of field images, so that 30-FPS progressive moving image data is output.
Namely, the frame rate of moving image data obtained by I/P conversion using the second conversion method is lower than that of moving image data obtained by I/P conversion using the first conversion method. Reducing the frame rate in this way is equivalent to the case where dropped frames are generated periodically. When dropped frames are generated irregularly, they bother the user. However, when dropped frames are generated periodically, the user is unlikely to notice them owing to residual images; therefore, the user is not much bothered.
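As a rough illustration of the difference between the two conversion methods, the following Python sketch pairs fields either with their temporal neighbors (first method, one output frame per field) or only within the same source frame (second method, one output frame per two fields). The field labels and the pairing representation are assumptions made for illustration; an actual I/P converter also interpolates the missing lines of each field:

```python
# Illustrative sketch only: fields are represented by labels such as "f0-top".
# A "frame" here is simply a pair of fields.

def ip_convert_first(fields):
    """First method: weave each field with its temporal neighbor.
    One frame per field -> about 60 frames/s from 60 fields/s."""
    return [(fields[i], fields[i + 1]) for i in range(len(fields) - 1)]

def ip_convert_second(fields):
    """Second method: pair only the two fields of the same source frame.
    One frame per two fields -> about 30 frames/s from 60 fields/s."""
    return [(fields[i], fields[i + 1]) for i in range(0, len(fields) - 1, 2)]

fields = ["f0-top", "f0-bottom", "f1-top", "f1-bottom"]
print(len(ip_convert_first(fields)))   # 3 frames from 4 fields (first method)
print(len(ip_convert_second(fields)))  # 2 frames from 4 fields (second method)
```

Because 30-FPS interlaced material carries 60 fields per second, the first pairing yields roughly 60 frames per second and the second roughly 30 frames per second, consistent with the description above.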
The procedure of setting the frame rate of the moving image data output to the GPU 105 will be described below with reference to a flowchart.
First, the graphic manager/media foundation player 532 acquires data on the capability (the number of cores, the operating frequency, etc.) of the CPU 101 from the operating system 201 (step S11).
Next, the graphic manager/media foundation player 532 determines whether or not the acquired capability satisfies a preset condition (step S12). According to the foregoing preset condition, the CPU 101 has a plurality of cores, and the operating frequency of the CPU 101 is more than 2 GHz.
If it is determined that the capability of the CPU 101 satisfies the preset condition (Yes in step S12), the graphic manager/media foundation player 532 instructs the Media Foundation 510 to execute I/P conversion using the first conversion method (step S13). Conversely, if it is determined that the capability of the CPU 101 does not satisfy the preset condition (No in step S12), the graphic manager/media foundation player 532 instructs the Media Foundation 510 to execute I/P conversion using the second conversion method (step S14).
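A minimal Python sketch of this selection logic is shown below. The threshold (a plurality of cores and an operating frequency above 2 GHz) comes from the preset condition described above; how the values are obtained is only an assumption, since in the embodiment they are queried from the operating system 201:

```python
import os

# Preset condition from the embodiment: multiple cores and a frequency above 2 GHz.
MIN_CORES = 2
MIN_FREQUENCY_GHZ = 2.0

def choose_conversion_method(core_count, frequency_ghz):
    """Return 'first' if the CPU meets the condition, otherwise 'second'."""
    if core_count >= MIN_CORES and frequency_ghz > MIN_FREQUENCY_GHZ:
        return "first"   # step S13: neighboring-field conversion, full frame rate
    return "second"      # step S14: same-frame conversion, reduced frame rate

# Example: the core count is queried here via os.cpu_count(); the frequency
# value is a placeholder rather than a real measurement.
cores = os.cpu_count() or 1
print(choose_conversion_method(cores, frequency_ghz=1.6))  # -> 'second'
```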
When playback is performed, the mixer 523A generates a frame image in accordance with the instruction from the graphic manager/media foundation player 532, and then, outputs the generated frame image to the presenter 523B.
As described above, the frame rate is set in accordance with the capability of the CPU 101. In this way, dropped frames are prevented from being generated irregularly when moving image data is played back; therefore, the user is unlikely to be bothered.
The foregoing embodiment relates to the case where DVD-VIDEO is played back. However, the frame rate may also be changed in accordance with the capability of the CPU when other moving image data is played back.
Furthermore, the CPU 101 and the GPU 105 may be integrated in a single semiconductor chip.
The various modules of the systems described herein can be implemented as software applications, hardware and/or software modules, or components on one or more computers, such as servers. While the various modules are illustrated separately, they may share some or all of the same underlying logic or code.
While certain embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the inventions. Indeed, the novel embodiments described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions and changes in the form of the embodiments described herein may be made without departing from the spirit of the inventions. The accompanying claims and their equivalents are intended to cover such forms or modifications as would fall within the scope and spirit of the inventions.
Number | Date | Country | Kind
2010-031822 | Feb. 16, 2010 | JP | national