This application claims priority under 35 U.S.C. § 119(a) to Korean Patent Application No. 10-2012-0003352, which was filed in the Korean Intellectual Property Office on Jan. 11, 2012, the content of which is incorporated herein by reference.
1. Field of the Invention
The present invention relates generally to a Three-Dimensional (3D) display apparatus and a method thereof, and more particularly to a 3D display apparatus and a method that can change a screen display state depending on the rotating state of the 3D display apparatus.
2. Description of the Related Art
With the development of electronic technology, various types of electronic devices have been developed and spread. In particular, various types of display devices, such as a Television (TV), a mobile phone, a Personal Computer (PC), a notebook PC, and a Personal Digital Assistant (PDA), have been widely used even in private homes.
As the use of display devices has increased, user needs for more diverse functions have also increased. In order to meet such needs, manufacturers have successively developed products having new functions.
Accordingly, devices having 3D display functions have recently proliferated. Such 3D display functions may be implemented not only in 3D TVs used in homes, but also in various other devices, such as 3D television receivers, monitors, mobile phones, PDAs, set-top PCs, tablet PCs, digital photo frames, and kiosks. Further, 3D display technology may be used in diverse fields that require 3D imaging, such as science, medicine, design, education, advertisement, and computer games.
In a 3D display apparatus, a screen that includes a plurality of objects having different depth perceptions is displayed, and a user perceives the 3D effect from the differences in depth perception between the respective objects. However, when the user intends to control the operation of the 3D display apparatus, the 3D screen may cause inconvenience. That is, the various types of menus that are displayed on the screen to control the operation of the 3D display apparatus may visually conflict with the objects that are displayed in 3D. Further, the menus may be hidden by the displayed objects, or the objects may be hidden by the menus, making menu selection and operation difficult.
The present invention has been made to address at least the above problems and/or disadvantages and to provide at least the advantages described below.
Accordingly, an aspect of the present invention provides a 3D display apparatus and a method thereof, which can adjust the depth perceptions of objects displayed on a screen if the apparatus is rotated while a 3D display is performed.
According to one aspect of the present invention, a 3D display method in a 3D display apparatus includes displaying a 3D screen including a plurality of objects having different depth perceptions, and displaying the 3D screen with a unified depth perception through adjustment of the depth perceptions of the plurality of objects to one depth perception if the 3D display apparatus moves to a first state.
According to another aspect of the present invention, a 3D display apparatus includes a display unit displaying a 3D screen including a plurality of objects having different depth perceptions, a sensing unit sensing a movement state of the 3D display apparatus, and a control unit controlling the display unit to display the 3D screen with a unified depth perception through adjustment of the depth perceptions of the plurality of objects to one depth perception if the 3D display apparatus moves to a first state.
According to embodiments of the present invention, the screen display state is changed depending on the rotating state of the 3D display apparatus, and thus the user can control the operation of the 3D display apparatus more conveniently and easily.
The above and other aspects, features and advantages of the present invention will be more apparent from the following detailed description when taken in conjunction with the accompanying drawings.
Various embodiments of the present invention will now be described in detail with reference to the accompanying drawings. In the following description, specific details such as detailed configuration and components are merely provided to assist the overall understanding of these embodiments of the present invention. Therefore, it should be apparent to those skilled in the art that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present invention. In addition, descriptions of well-known functions and constructions are omitted for clarity and conciseness.
Referring to the accompanying drawings, the 3D display apparatus 100 includes a display unit 110, a control unit 120, and a sensing unit 130.
The display unit 110 may be driven in various manners depending on the 3D display method. That is, 3D display methods may be divided into a glasses type and a non-glasses type depending on whether 3D glasses are worn, and the glasses type may be further divided into a shutter glass type and a polarization type.
The shutter glass type 3D display method is a method in which a synchronization signal is transmitted to the 3D glasses so that a left-eye shutter glass and a right-eye shutter glass are alternately opened at the corresponding image output times while a left-eye image and a right-eye image are alternately displayed through the display unit 110. When the shutter glass type 3D display method is performed, the display unit 110 alternately displays the left-eye image and the right-eye image. The left-eye image and the right-eye image refer to image frames configured so that the same objects are spaced apart from each other by disparities corresponding to the depth perceptions of the objects. For example, if object 1 displayed in the left-eye image and object 1 displayed in the right-eye image are spaced apart from each other by disparity 1, and object 2 displayed in the left-eye image and object 2 displayed in the right-eye image are spaced apart from each other by disparity 2, the depth perceptions of object 1 and object 2 become different from each other.
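For illustration only, the following Python sketch shows how such disparities could be produced by placing the same object at different horizontal positions in the two eye images; the symmetric-shift placement model and the function name are assumptions made for the example, not details from the specification.

```python
# Minimal sketch: place one object in the left-eye and right-eye frames with a
# horizontal offset (disparity) that determines its perceived depth. The
# symmetric-shift model below is an illustrative assumption.

def place_object(center_x: float, disparity: float) -> tuple[float, float]:
    """Return the object's x position in the (left-eye, right-eye) images.

    A positive disparity shifts the left-eye copy right and the right-eye
    copy left, so the object appears in front of the screen plane; zero
    disparity places it on the screen plane.
    """
    half = disparity / 2.0
    return center_x + half, center_x - half

# Object 1 and object 2 are given different disparities, so they are
# perceived at different depths, as in the example above.
left1, right1 = place_object(center_x=100.0, disparity=8.0)  # appears nearer
left2, right2 = place_object(center_x=220.0, disparity=2.0)  # appears farther
print(f"object 1: left-eye x={left1}, right-eye x={right1}")
print(f"object 2: left-eye x={left2}, right-eye x={right2}")
```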
The polarization type 3D display method is a method in which a left-eye image and a right-eye image are divided for each line, and the divided left-eye image lines and right-eye image lines are alternately arranged to generate and output at least one image frame. In this case, the display unit 110 causes the respective lines to have different polarization directions by using a polarizing film attached to a panel. The 3D glasses that a user wears have a left-eye glass and a right-eye glass that transmit lights having different polarization directions. Accordingly, the left-eye image lines are recognized only by the left eye, and the right-eye image lines are recognized only by the right eye, so that a viewer can perceive the 3D effect corresponding to the object disparity between the left-eye image and the right-eye image. In the case of the non-glasses type 3D display method, the display unit 110 includes a lenticular lens array or a parallax barrier. The display unit 110 divides the left-eye image and the right-eye image for each line and alternately arranges the respective lines to form and display at least one image frame. The light for each line of the image frame is dispersed to a plurality of viewing areas by the lenticular lens array or the parallax barrier. The respective viewing areas may be formed at an interval of about 65 mm, which corresponds to the distance between the two eyes of a human.
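The line-by-line interleaving used by the polarization and non-glasses methods can be sketched as follows; this is a minimal illustration, and the even/odd line assignment and the toy image sizes are assumptions rather than details from the specification.

```python
import numpy as np

def interleave_rows(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    """Build one frame whose even lines come from the left-eye image and
    whose odd lines come from the right-eye image (assignment assumed)."""
    assert left.shape == right.shape, "eye images must have the same shape"
    frame = np.empty_like(left)
    frame[0::2] = left[0::2]    # even lines -> left-eye image
    frame[1::2] = right[1::2]   # odd lines  -> right-eye image
    return frame

# Toy 4x4 single-channel "images": all-1 for the left eye, all-2 for the right.
left_img = np.full((4, 4), 1)
right_img = np.full((4, 4), 2)
print(interleave_rows(left_img, right_img))  # rows alternate 1,2,1,2
```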
The left-eye image and the right-eye image may form a 3D image of broadcasting content or multimedia reproduction content, or a 3D image that includes various types of User Interface (UI) windows or objects, such as widgets, images, and text.
The sensing unit 130 senses the motion state of the 3D display apparatus 100. The motion states include various states to which the 3D display apparatus 100 can move, such as an inclined state, a rotating state, and a movement state. The sensing unit 130 may include a geomagnetic sensor, a gyro sensor, and an acceleration sensor. Accordingly, the sensing unit 130 can sense whether the 3D display apparatus 100 is placed in the vertical direction or in the horizontal direction, or whether the 3D display apparatus 100 is in the horizontal state or in an inclined state, through measurement of a rotating angle, a pitch angle, a yaw angle, and a roll angle of the 3D display apparatus 100.
The sensing unit 130 may be implemented to include at least one of the various types of sensors described above, and its sensing method may differ depending on the type of the sensor. For example, when the sensing unit 130 is provided with a two-axis fluxgate geomagnetic sensor, the sensing unit 130 measures the size and direction of an external magnetic field by sensing the size of an electrical signal of the two-axis fluxgate, which changes depending on the rotating state thereof. Since the output values detected from the respective fluxgates are affected by the inclination, the pitch angle, the roll angle, and the yaw angle of the 3D display apparatus 100 may be calculated using the output values. If the pitch angle and the roll angle become 0 degrees, the sensing unit 130 determines that the 3D display apparatus 100 is placed in the horizontal state with respect to the ground surface, while if the pitch angle or the roll angle becomes 90 degrees, the sensing unit 130 determines that the 3D display apparatus 100 is put upright in the vertical or horizontal direction. The sensing unit 130 can also determine the rotating direction of the 3D display apparatus 100 according to the size and sign of the yaw angle. These values may differ depending on the direction in which the sensor is mounted in the 3D display apparatus 100.
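For illustration, the following sketch maps pitch and roll readings to the horizontal, upright, and inclined poses described above; the tolerance value and the function name are assumptions made for the example.

```python
def classify_orientation(pitch_deg: float, roll_deg: float,
                         tolerance_deg: float = 5.0) -> str:
    """Classify the device pose from pitch and roll angles.

    Pitch and roll near 0 degrees mean the device lies flat (horizontal
    state); a pitch or roll near 90 degrees means it stands upright.
    The tolerance is an illustrative assumption.
    """
    if abs(pitch_deg) < tolerance_deg and abs(roll_deg) < tolerance_deg:
        return "horizontal"
    if (abs(abs(pitch_deg) - 90.0) < tolerance_deg
            or abs(abs(roll_deg) - 90.0) < tolerance_deg):
        return "upright"
    return "inclined"

print(classify_orientation(1.2, -0.8))  # -> horizontal
print(classify_orientation(88.5, 0.0))  # -> upright
```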
In addition, the sensing unit 130 may adopt various types of sensors that are known in the art, and thus a detailed description and illustration thereof will be omitted.
The control unit 120 controls the operation of the display unit 110 depending on the result of the sensing by the sensing unit 130. Specifically, if it is sensed that the 3D display apparatus 100 moves to the first state, the control unit 120 may control the display unit 110 to unify the depth perceptions of the respective objects that are being displayed through the display unit 110 into one depth perception.
The display unit 110 may unify the depth perceptions by adjusting the disparities, that is, by shifting the positions of the objects displayed in the left-eye image and the right-eye image that form the 3D screen.
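A minimal sketch of this disparity unification follows, assuming each object is represented only by its x positions in the two eye images; the StereoObject model and the symmetric shift are illustrative assumptions, not the apparatus's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class StereoObject:
    left_x: float   # x position of the object in the left-eye image
    right_x: float  # x position of the object in the right-eye image

    @property
    def disparity(self) -> float:
        return self.left_x - self.right_x

def unify_depths(objects: list[StereoObject], target: float = 0.0) -> None:
    """Shift each object's left/right positions symmetrically so that every
    object ends up with the same disparity, i.e. one depth perception."""
    for obj in objects:
        delta = (target - obj.disparity) / 2.0
        obj.left_x += delta
        obj.right_x -= delta

scene = [StereoObject(104.0, 96.0), StereoObject(221.0, 219.0)]
unify_depths(scene, target=0.0)  # move all objects onto the screen plane
print([round(o.disparity, 3) for o in scene])  # -> [0.0, 0.0]
```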
Further, according to an embodiment of the present invention, the first state may be defined in several manners. For example, the first state may be a horizontal state in which the 3D display apparatus 100 is placed horizontally on the ground surface, a state in which the 3D display apparatus 100 is rotated in a horizontal or vertical direction, a state in which the 3D display apparatus 100 is inclined beyond a predetermined inclination, a state in which the 3D display apparatus 100 is rotated beyond a predetermined rotation angle, a state in which the 3D display apparatus 100 moves toward or away from the user while maintaining its inclination, or a state in which the 3D display apparatus 100 is moved to form a specified pattern. In the following description, it is assumed that the horizontal state is defined as the first state.
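Since several alternative definitions of the first state are possible, each definition can be modeled as an interchangeable predicate over a sensor sample, as in this illustrative sketch; every name and threshold below is an assumption.

```python
from typing import Callable, Dict

# A sensor sample: pitch, roll, and yaw angles in degrees (model assumed).
Sample = Dict[str, float]

def is_horizontal(s: Sample, tol_deg: float = 5.0) -> bool:
    """First-state definition 1: the apparatus lies flat on a surface."""
    return abs(s["pitch"]) < tol_deg and abs(s["roll"]) < tol_deg

def is_rotated_past(s: Sample, limit_deg: float = 45.0) -> bool:
    """First-state definition 2: rotated beyond a predetermined angle."""
    return abs(s["yaw"]) > limit_deg

# The apparatus could be configured with whichever definition applies.
first_state: Callable[[Sample], bool] = is_horizontal
print(first_state({"pitch": 0.5, "roll": 1.0, "yaw": 10.0}))  # -> True
```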
If the 3D display apparatus 100 is rotated while the 3D screen 10 that includes the respective objects Ob1, Ob2, and Ob3 is displayed, the depth perceptions of the respective objects Ob1, Ob2, and Ob3 are adjusted to be unified into one depth perception.
Although it is shown that only the depth perceptions are adjusted depending on the motion of the 3D display apparatus 100 in the embodiment described above, the layout of the 3D screen 10 may also be changed together with the depth perception adjustment.
Referring to the accompanying drawings, the layout of the 3D screen 10 may be changed so that the objects Ob1, Ob2, and Ob3 are rearranged, for example, at uniform distances from one another.
Such layout change may be selectively performed depending on the types of content displayed on the 3D screen 10. That is, if there are objects present that belong to the same type, such as a plurality of images or menu icons, the layout may be changed so that the distances, sizes, and shapes of the objects coincide with one another for each type. By contrast, if a plurality of objects having different types is present, the layout may be changed so that the objects are grouped for each type.
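A minimal sketch of such type-based regrouping follows, assuming a toy object model with only a type label and an x position; the spacing values and the dictionary representation are illustrative assumptions.

```python
from collections import defaultdict

def regroup_by_type(objects: list, spacing: float = 50.0) -> None:
    """Rearrange objects so that items of the same type sit together in one
    contiguous run at a uniform spacing, with a wider gap between types.
    Objects are plain dicts with 'type' and 'x' keys (model assumed)."""
    groups = defaultdict(list)
    for obj in objects:
        groups[obj["type"]].append(obj)

    x = 0.0
    for obj_type in sorted(groups):      # one contiguous run per type
        for obj in groups[obj_type]:
            obj["x"] = x                 # uniform distance within a group
            x += spacing
        x += 2 * spacing                 # wider gap between groups

scene = [{"type": "image", "x": 310.0}, {"type": "icon", "x": 40.0},
         {"type": "image", "x": 95.0}]
regroup_by_type(scene)
print(scene)  # the icon is placed first, then both images at uniform spacing
```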
The distances among the objects may be unified in this manner. Further, if the 3D display apparatus 100 is rotated to the second state, a menu 20 may be additionally displayed on the 3D screen 10.
In particular, if the 3D display apparatus 100 moves to the original state after the menu display or the layout change is performed, the 3D screen 10 may be restored to its original display state.
The depth perception adjustment and the screen display change described above may also be performed in combination.
If the 3D display apparatus 100 is rotated to the first state while it is in the second state with the menu 20 displayed, the depth perceptions of the respective objects Ob1, Ob2, and Ob3 are unified into one depth perception with the menu 20 still displayed.
Likewise, if the 3D display apparatus 100 moves to the first state after the layout of the 3D screen 10 has been changed, the depth perceptions of the respective objects Ob1, Ob2, and Ob3 are unified into one depth perception with the changed layout maintained.
Further, the various types of 3D display methods described above may be applied to any of the foregoing embodiments.
According to the embodiments of the present invention, the menu may be displayed along with the depth perception adjustment described above.
In the 3D display method according to an embodiment of the present invention, a 3D screen including a plurality of objects having different depth perceptions is displayed, and if the 3D display apparatus moves to the first state, the depth perceptions of the plurality of objects are adjusted to be unified into one depth perception.
If the 3D display apparatus then moves in the opposite direction and returns to the original state in step S825, the depth perceptions are readjusted to their original states in step S830.
If the 3D display apparatus is rotated to the second state in step S835, the screen display state is changed in step S840. Specifically, the change of the screen display state may be a process of additionally displaying a menu on the 3D screen or a process of changing the layout through rearrangement of the objects on the 3D screen.
If the 3D display apparatus returns from the second state to the previous state in step S845, the screen display state is readjusted to the previous state in step S850. These steps are performed until the 3D display is finished in step S855.
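The control flow of these steps can be condensed into a short sketch; the event names and boolean state flags below are assumptions used only to mirror the described behavior, with sensing and rendering stubbed out.

```python
def run_display_loop(events: list) -> None:
    """Apply the screen changes described above for each motion event:
    unify depths on entering the first state, display the menu or change
    the layout on entering the second state, and restore the previous
    display state when the apparatus moves back."""
    depth_unified = False
    menu_shown = False
    for event in events:
        if event == "to_first_state":
            depth_unified = True       # unify all object depths
        elif event == "from_first_state":
            depth_unified = False      # restore original depths (S830)
        elif event == "to_second_state":
            menu_shown = True          # show menu / change layout (S840)
        elif event == "from_second_state":
            menu_shown = False         # restore previous screen (S850)
        print(f"{event}: depth_unified={depth_unified}, menu={menu_shown}")

run_display_loop(["to_first_state", "from_first_state",
                  "to_second_state", "from_second_state"])
```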
As described above, the first state and the second state may be defined as various states, and the layout may also be changed in diverse manners.
A program for performing the methods according to embodiments of the present invention as described above may be stored in various types of recording media.
Specifically, such a program may be stored in various types of recording media that can be read by a terminal, such as a Random Access Memory (RAM), a flash memory, a Read Only Memory (ROM), an Erasable Programmable ROM (EPROM), an Electrically Erasable Programmable ROM (EEPROM), a register, a hard disk, a removable disk, a memory card, a Universal Serial Bus (USB) memory, and a CD-ROM.
While the present invention has been shown and described with reference to certain embodiments thereof, it will be understood by those skilled in the art that changes in form and detail may be made therein without departing from the spirit and scope of the present invention, as defined by the appended claims.
Foreign Application Priority Data:

Number | Date | Country | Kind
---|---|---|---
10-2012-0003352 | Jan 2012 | KR | national
U.S. Patent Documents:

Number | Name | Date | Kind
---|---|---|---
8692853 | Kim et al. | Apr 2014 | B2 |
9041779 | Yoshino | May 2015 | B2 |
20070046630 | Hong et al. | Mar 2007 | A1 |
20100188503 | Tsai et al. | Jul 2010 | A1 |
20100189413 | Yoshino | Jul 2010 | A1 |
20100208040 | Guillou | Aug 2010 | A1 |
20100238196 | Hinterberger et al. | Sep 2010 | A1 |
20110093778 | Kim et al. | Apr 2011 | A1 |
20110126159 | Ko et al. | May 2011 | A1 |
20110126160 | Han et al. | May 2011 | A1 |
20110221777 | Ke | Sep 2011 | A1 |
20110254846 | Lee | Oct 2011 | A1 |
20120001943 | Ishidera | Jan 2012 | A1 |
20120081359 | Lee | Apr 2012 | A1 |
20130113783 | Pourbigharaz | May 2013 | A1 |
20150172643 | Yoshino | Jun 2015 | A1 |
20150189260 | Yoshino | Jul 2015 | A1 |
Foreign Patent Documents:

Number | Date | Country
---|---|---
2 346 263 | Jul 2011 | EP |
2 346 264 | Jul 2011 | EP |
2 448 275 | May 2012 | EP |
2 448 276 | May 2012 | EP |
2 448 277 | May 2012 | EP |
2 456 214 | May 2012 | EP |
2 456 215 | May 2012 | EP |
2 456 216 | May 2012 | EP |
2 456 217 | May 2012 | EP |
2 456 218 | May 2012 | EP |
2006-293878 | Oct 2006 | JP |
2010-175643 | Aug 2010 | JP |
2010-257160 | Nov 2010 | JP |
2011028263 | Feb 2011 | JP |
1020110026811 | Mar 2011 | KR |
1020110054256 | May 2011 | KR |
1020110056775 | May 2011 | KR |
1020110086415 | Jul 2011 | KR |
WO 2008030005 | Mar 2008 | WO |
WO 2011123178 | Oct 2011 | WO |
Other Publications:

- European Search Report dated Aug. 13, 2015 issued in counterpart application No. 13735681.2-1903, 9 pages.
- Korean Office Action dated Jan. 29, 2018 issued in counterpart application No. 10-2012-0003352, 7 pages.
- European Search Report dated Jul. 17, 2018 issued in counterpart application No. 13735681.2-1209, 7 pages.
Prior Publication Data:

Number | Date | Country
---|---|---
20130176301 A1 | Jul 2013 | US