This application claims priority under 35 U.S.C. §119 to Korean Patent Application No. 10-2010-0125837, filed on Dec. 9, 2010, in the Korean Intellectual Property Office, the disclosure of which is incorporated herein by reference in its entirety.
The present invention relates to a mixed reality display platform for presenting an augmented 3D stereo image and an operation method thereof, and more particularly, to a mixed reality display platform for presenting an augmented 3D stereo image, and an operation method thereof, capable of presenting a natural 3D image in the 3D space around a user using a plurality of 3D image display devices.
Most 3D image presentation technologies, which have been popularized in the movie and TV fields, use the binocular disparity effect, that is, the difference between the images of a 3D object in the external environment formed on the retinas of the two eyes. However, this method presents an image having virtual depth perception in the spaces in front of and behind an image output surface, such as an LCD screen, spaced apart from the user by a fixed distance, by outputting binocular disparity information to that surface, and thus has the fundamental disadvantage of causing significant fatigue in the human visual system.
In addition, the interactive hologram display technology presented in contents such as movies is an ideal 3D image display technology that fully accommodates the human stereo vision perception characteristics, but its implementation is a long way off at the current technological level, which leads general consumers to misunderstand 3D image technology and to be disappointed in the current technology.
In the present invention, various homogeneous and heterogeneous 3D image display devices divide and share a physical space for expressing a 3D image, and real-time contents information is generated based on user information and information on the divided space so that the generated real-time contents information is displayed together using the various 3D image display devices.
An exemplary embodiment of the present invention provides a mixed reality display platform for presenting an augmented 3D stereo image, including: an input/output controller controlling a plurality of display devices, including at least one 3D display device, which are associated with each other; an advance information manager establishing a 3D expression space for each display device so as to divide or share a physical space for expressing a 3D stereo image, by collecting space establishment information from each display device; and a real-time information controller generating real-time contents information using 3D contents for a virtual space and user information including binocular 6 degree-of-freedom information, a gaze direction, and focusing information of a user, wherein the input/output controller distributes the real-time contents information to the display devices on the basis of the user information and the 3D expression space information established for each display device.
Another exemplary embodiment of the present invention provides a mixed reality display platform for presenting an augmented 3D stereo image, including: an input/output controller controlling a plurality of display devices, including at least one 3D display device, which are associated with each other; an advance information manager including a space establishment collecting unit collecting information on an optimal 3D space which is expressible by each display device, a virtual space 3D contents database storing 3D contents for a virtual space, an authoring unit authoring the information of the physical space collected by the space establishment collecting unit and the information of the virtual space as an inter-placement relationship in a 3D space, and an optimal space establishment information database storing the authoring result as optimal 3D expression space establishment information for each display device; and a real-time information controller including a user information extracting unit extracting user information, a multi-user participation supporting unit managing the interrelationship of a plurality of users when there are multiple users, a real-time contents information generating unit generating real-time contents information on the basis of the user information, the interrelationship of the plurality of users, and the 3D contents for the virtual space, and a user adaptive device and image parameter controlling unit managing the user information and modifying the optimal 3D expression space establishment information for each display device on the basis of personal information of the user collected in advance.
Yet another exemplary embodiment of the present invention provides an operation method of a mixed reality display platform for presenting an augmented 3D stereo image, including: collecting information on an optimal 3D space which is expressible by each of a plurality of display devices including at least one 3D display device; establishing a 3D expression space for each display device, on the basis of the collected information on the optimal 3D spaces, so as to divide or share a physical space for expressing a 3D stereo image among the display devices; collecting user information including binocular 6 degree-of-freedom information, a gaze direction, and focusing information of a user; generating real-time contents information using 3D contents for a virtual space and the user information; and distributing the real-time contents information to each display device on the basis of the user information and the 3D expression space information established for each display device.
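For illustration only, the steps of this operation method can be mirrored in a minimal sketch. All class, function, and device names below (e.g., `DepthZone`, `distribute`) are hypothetical, and the depth intervals are arbitrary sample values; this is not an implementation of the claimed platform.

```python
"""Illustrative sketch of the operation method above; all names are hypothetical."""
from dataclasses import dataclass


@dataclass
class DepthZone:
    near: float  # nearest depth (m) the device expresses comfortably
    far: float   # farthest depth (m)

    def contains(self, depth: float) -> bool:
        return self.near <= depth <= self.far


@dataclass
class VirtualObject:
    name: str
    depth: float  # distance from the viewer (m)


def establish_zones(devices):
    # Step 2: the collected volumes are adopted as-is here; a real authoring
    # step would also resolve overlaps between devices (divide or share).
    return dict(devices)


def distribute(objects, zones):
    # Step 5: assign each object to a device whose zone covers its depth.
    out = {name: [] for name in zones}
    for obj in objects:
        for name, zone in zones.items():
            if zone.contains(obj.depth):
                out[name].append(obj.name)
                break
    return out


# Steps 1 and 3 (device and user information collection) are stubbed with
# fixed sample data; step 4 is represented by the object list itself.
devices = {"wearable_display": DepthZone(0.3, 1.5),
           "public_screen": DepthZone(1.5, 10.0)}
scene = [VirtualObject("ball", 0.8), VirtualObject("tree", 6.0)]
print(distribute(scene, establish_zones(devices)))
# {'wearable_display': ['ball'], 'public_screen': ['tree']}
```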
Other features and aspects will be apparent from the following detailed description, the drawings, and the claims.
Hereinafter, exemplary embodiments will be described in detail with reference to the accompanying drawings. Throughout the drawings and the detailed description, unless otherwise described, the same drawing reference numerals will be understood to refer to the same elements, features, and structures. The relative size and depiction of these elements may be exaggerated for clarity, illustration, and convenience. The following detailed description is provided to assist the reader in gaining a comprehensive understanding of the methods, apparatuses, and/or systems described herein. Accordingly, various changes, modifications, and equivalents of the methods, apparatuses, and/or systems described herein will be suggested to those of ordinary skill in the art. Also, descriptions of well-known functions and constructions may be omitted for increased clarity and conciseness.
Hereinafter, a mixed reality display platform for presenting an augmented 3D stereo image and an operation method thereof according to exemplary embodiments of the present invention will be described in detail with reference to the accompanying drawings.
As shown in the accompanying drawing, the mixed reality display platform for presenting the augmented 3D stereo image according to an exemplary embodiment of the present invention includes an advance information manager 100, a real-time information controller 200, and an input/output platform 300.
The advance information manager 100 establishes the relationships between hardware components and software components in advance, and stores and manages them in a database structure, in order to configure the finally completed single integrated virtual stereo space. To this end, the advance information manager 100 includes an input/output device space establishment collecting unit 110, a device optimal 3D expression space establishment information database 120, a 3D image expression space dividing/sharing establishment authoring unit 130, and a virtual space 3D contents database 140.
The virtual space 3D contents database 140 represents a database storing virtual reality and mixed reality software contents and includes model data for a 3D space, i.e., geographical features, natural features, environments, and objects which become interaction targets.
Using the data stored in the virtual space 3D contents database 140, the mixed reality display platform for presenting the augmented 3D stereo image according to the exemplary embodiment of the present invention may present to a user a virtual reality space constituted only of virtual objects, or may implement a mixed reality system in which a service scenario is performed through interaction between digital contents objects of the virtual space and real users and objects.
The input/output device space establishment collecting unit 110 acquires, from each of the display devices 320, 330, and 340, information on the optimal 3D space which can be expressed by that display device included in the mixed reality display platform for presenting the augmented 3D stereo image according to the exemplary embodiment of the present invention. In this case, the display devices include a common display device, a portable (mobile) display device, and a personal wearable display device, and the 3D spaces which can be expressed by the respective display devices include a volume of public screen (VPS), a volume of mobile screen (VMS), and a volume of personal virtual screen (VpVS).
The input/output device space establishment collecting unit 110 also collects, as information on the user's surrounding environment, installation status information of the input sensor devices (e.g., image input devices such as cameras, input devices based on position and acceleration sensors, and the like) and the information output devices (a sound effect output device, a mono display device, and the like), other than the 3D display devices, that are installed in the physical space. The installation status information of the input sensor devices and the information output devices may include 6 degree-of-freedom information (e.g., three positions: x, y, and z; and three poses: pitch, yaw, and roll) and control-related time information.
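For illustration, the collected establishment information might be recorded as in the following sketch; the record types, field names, and units are assumptions made for this example, not a format prescribed by the platform.

```python
# Hypothetical record types for the collected establishment information;
# the field names and units are assumptions for illustration only.
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class Pose6DOF:
    x: float          # position (m)
    y: float
    z: float
    pitch: float      # pose (degrees)
    yaw: float
    roll: float


@dataclass
class DeviceEstablishment:
    device_id: str
    kind: str                     # "public", "mobile", "personal", "sensor", "audio", ...
    installation: Pose6DOF        # 6 degree-of-freedom installation status
    latency_ms: float             # control-related time information
    depth_range: Optional[Tuple[float, float]] = None  # (Dn, Dp) for 3D displays (VPS/VMS/VpVS)
```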
The input/output device space establishment collecting unit 110 provides the collected information to the 3D image expression space dividing/sharing establishment authoring unit 130.
The 3D image expression space dividing/sharing establishment authoring unit 130, as a 3D contents modeling tool, provides a GUI-based function for authoring the physical space information provided by the input/output device space establishment collecting unit 110 and the virtual space information stored in the virtual space 3D contents database 140 as an inter-arrangement relationship in a 3D space. This is an operation of placing, in a 3D space model, the zone which each 3D display device takes charge of. The responsible zones may be adjusted manually by the user, or may be placed automatically so that each display device appropriately shares and divides the 3D space according to predetermined numerical values, by receiving minimum-appropriate-dangerous-maximum zone information (e.g., the depths of positive and negative parallaxes, and the like) from the corresponding 3D display device.
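A minimal sketch of such automatic placement follows, assuming each device reports only an appropriate depth interval and that overlapping intervals are split at their midpoint; the function name `divide_depth_axis` and the sample intervals are hypothetical simplifications of the authoring step described above.

```python
# Sketch of automatic zone placement from per-device parallax depth limits;
# each device reports an appropriate depth interval (near, far) in metres,
# and overlapping intervals are split at their midpoint. Names are hypothetical.

def divide_depth_axis(intervals):
    ordered = sorted(intervals.items(), key=lambda kv: kv[1][0])
    zones = {}
    prev_name = None
    prev_far = None
    for name, (near, far) in ordered:
        if prev_far is not None and near < prev_far:
            mid = (near + prev_far) / 2.0  # split the overlap at its midpoint
            zones[prev_name] = (zones[prev_name][0], mid)
            near = mid
        zones[name] = (near, far)
        prev_name, prev_far = name, far
    return zones


print(divide_depth_axis({"hmd": (0.3, 2.0), "wall_screen": (1.2, 8.0)}))
# {'hmd': (0.3, 1.6), 'wall_screen': (1.6, 8.0)}
```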
As described above, when a spatial relationship of appropriate virtual information which can be expressed by each display device is defined, initial establishment information for the defined spatial relationship is stored in the device optimal 3D expression space establishment information database 120.
The real-time information controller 200 extracts information on the single user or plurality of users participating at every moment of operating the entire system, and changes the parameters set as initial values in order to present a natural 3D space. The user information may include 6 degree-of-freedom (DOF) information associated with the vision of each of the user's two eyes, a view vector, and focusing information, and may also include information on what types of input/output devices and sensors the user can currently interact with. The real-time information controller 200 includes a user adaptive device and image parameter controlling unit 210, a user information extracting unit 220, a multi-user participation support controlling unit 230, and a real-time contents information generating unit 240.
The user information extracting unit 220 accurately tracks which space the user is currently observing, on the basis of the 6 degree-of-freedom (position and pose) information associated with the vision of each of the user's two eyes, the view vector, and the focusing information, and transfers the related information to the user adaptive device and image parameter controlling unit 210 so that, among the plurality of display devices, the display device capable of best expressing the 3D stereo effect of the corresponding space processes the information for the corresponding user.
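For illustration, a gaze-based device selection of this kind might look like the following sketch, with expression volumes simplified to axis-aligned boxes; all names and sample values are hypothetical.

```python
# Sketch: choose the display device whose expression volume covers the point
# the user is looking at (eye position + view vector * focus distance).
# Volumes are simplified to axis-aligned boxes; all names are hypothetical.

def gaze_point(eye_pos, view_vec, focus_dist):
    return tuple(p + v * focus_dist for p, v in zip(eye_pos, view_vec))


def select_device(volumes, eye_pos, view_vec, focus_dist):
    # volumes: device -> ((xmin, ymin, zmin), (xmax, ymax, zmax))
    gp = gaze_point(eye_pos, view_vec, focus_dist)
    for name, (lo, hi) in volumes.items():
        if all(l <= c <= h for c, l, h in zip(gp, lo, hi)):
            return name
    return None  # no device can express this point


volumes = {"VpVS": ((-0.5, -0.5, 0.2), (0.5, 0.5, 1.0)),
           "VPS": ((-2.0, -1.5, 1.0), (2.0, 1.5, 6.0))}
print(select_device(volumes, (0.0, 0.0, 0.0), (0.0, 0.0, 1.0), 3.0))  # -> 'VPS'
```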
In addition, information on what types of input/output devices and sensors the user is currently interacting with is collected and transferred to the user adaptive device and image parameter controlling unit 210, such that the system can divide and present various multimodal input/output information to the individual users.
The multi-user participation support controlling unit 230 processes a situation in which a plurality of users use the mixed reality display platform for presenting the augmented 3D stereo image in one physical space. In this case, more than one mixed reality display platform for presenting the augmented 3D stereo image may be present. Therefore, the multi-user participation support controlling unit 230 collects the current interaction states of the plurality of users (situational information on actions of observing or interacting with the virtual space) to share the virtual object information which each user can experience, or to process the information distributively.
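A minimal sketch of such state collection follows, assuming each user's interaction state is reduced to sets of observed and manipulated objects; these fields and names are hypothetical simplifications.

```python
# Sketch of multi-user coordination: gather each user's current interaction
# state and derive which virtual objects must be shared (kept consistent)
# across users. Names and fields are hypothetical.

def shared_objects(user_states):
    # user_states: user -> {"observing": set, "interacting": set}
    seen_by = {}
    for user, st in user_states.items():
        for obj in st["observing"] | st["interacting"]:
            seen_by.setdefault(obj, set()).add(user)
    return {obj for obj, users in seen_by.items() if len(users) > 1}


states = {"alice": {"observing": {"globe"}, "interacting": {"pen"}},
          "bob":   {"observing": {"globe", "pen"}, "interacting": set()}}
print(shared_objects(states))  # {'globe', 'pen'} (order may vary)
```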
The user adaptive device and image parameter controlling unit 210 takes charge of adjusting the ranges of some of the information dividing and sharing processing condition values of each display device, which are set as initial values by the advance information manager 100, according to the user's personal physical and perceptual characteristics and personal preferences. That is, since the region in which a 3D effect is naturally felt may vary slightly with these personal physical and perceptual characteristics, the transition boundary regions of information among the 3D spaces (e.g., VPS, VpVS, VMS, and the like) are adjusted using personalized advance information.
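For illustration, one possible form of such per-user adjustment is sketched below, assuming a single scalar comfort factor collected in advance; the factor, the function name, and the sample values are hypothetical.

```python
# Sketch: per-user adjustment of the initial zone boundaries. A personal
# comfort factor (collected in advance) shrinks or stretches each depth
# range around its centre. Values and names are hypothetical.

def personalize_zones(zones, comfort=1.0):
    # zones: device -> (near, far); comfort < 1.0 narrows each depth range
    # for users who are sensitive to strong parallax.
    out = {}
    for name, (near, far) in zones.items():
        centre, half = (near + far) / 2.0, (far - near) / 2.0
        out[name] = (centre - half * comfort, centre + half * comfort)
    return out


initial = {"hmd": (0.3, 1.6), "wall_screen": (1.6, 8.0)}
print(personalize_zones(initial, comfort=0.8))
# approximately {'hmd': (0.43, 1.47), 'wall_screen': (2.24, 7.36)}
```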
The real-time contents information generating unit 240 generates real-time contents information, which is the final output result, by processing interaction events associated with the progression of service contents on the basis of the information of the virtual space 3D contents database 140 and the user input acquired from the user information extracting unit 220 and the multi-user participation support controlling unit 230, and transfers the generated real-time contents information to the virtual object information input/output controlling unit 310 among multiple 3D stereo spaces of the input/output platform 300.
The input/output platform group 300 includes various display devices 320, 330, and 340 and the controlling unit 310 for controlling the display devices.
The virtual object information input/output controlling unit 310 among multiple 3D stereo spaces separates the dividing and sharing information of the output result of the real-time contents information generating unit 240 on the basis of the multi-user conditions and the personally optimized conditions, and transmits the separated information to each of the input/output devices 320, 330, and 340.
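A minimal sketch of this separation follows, assuming one generated frame is filtered against each user's personalized device zones; the packet structure and all names are hypothetical.

```python
# Sketch: separate one generated frame into per-device, per-user packets by
# applying each user's personalized zones. Names are hypothetical.

def separate(frame, zones_by_user):
    # frame: list of (object_name, depth);
    # zones_by_user: user -> device -> (near, far)
    packets = {}
    for user, zones in zones_by_user.items():
        for device, (near, far) in zones.items():
            packets[(user, device)] = [o for o, d in frame if near <= d <= far]
    return packets


frame = [("ball", 0.8), ("tree", 6.0)]
zones = {"alice": {"hmd": (0.3, 1.6), "wall_screen": (1.6, 8.0)}}
print(separate(frame, zones))
# {('alice', 'hmd'): ['ball'], ('alice', 'wall_screen'): ['tree']}
```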
For convenience of description, the display devices 320, 330, and 340 are illustrated as examples of the common display device, the portable (mobile) display device, and the personal wearable display device described above.
In the mixed reality display platform for presenting the augmented 3D stereo image according to the exemplary embodiment of the present invention, the respective display devices 320, 330, and 340 receive the user information, the current interaction states of the multiple users, the real-time contents information, and the like corresponding to each display device from the virtual object information input/output controlling unit 310 among multiple 3D stereo spaces, and present an appropriate 3D image using the same. In this case, each display device may be, for example, a display device including a visual interface device for presenting a mixture of multiple 3D images as disclosed in Korean Patent Application Laid-Open No. 10-2006-0068508, or a face wearable display device for a mixed reality environment as disclosed in Korean Patent Application Laid-Open No. 10-2008-0010502.
In the mixed reality display platform for presenting the augmented 3D stereo image according to the exemplary embodiment of the present invention, the units other than the display devices 320, 330, and 340 may be implemented in one apparatus 10 such as a computer, and as necessary, the units may be implemented in two or more apparatuses or in a form in which some units are included in a display device. For example, when a predetermined common display device operates as a main display device, the constituent members of the mixed reality display platform for presenting the augmented 3D stereo image may be implemented in the main display device.
In general, the left and right eyes each sense 3D spatial information as an independent image projected in two dimensions onto the retina with a visual disparity (d), and the brain perceives a 3D stereo effect and a 3D object from the two images (see the accompanying drawing).
A 3D display technology using the above principle presents two left and right images on one physical and optical screen and uses a technique (e.g., a polarizing filter) for separating the images so that they are independently transferred to the left and right eyes.
Herein, VOp represents a virtual object in a positive parallax area, VOz represents a virtual object in a zero parallax area, VOn represents a virtual object in a negative parallax area, and Dp and Dn represent depths of positive parallax and negative parallax, respectively. RP represents a real point, VP represents a virtual point, and d represents a distance (zero parallax) on the screen.
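The notation above follows from standard stereo geometry: for eyes with inter-pupil distance IPD, a screen at distance D, and an object at distance z from the viewer, similar triangles give an on-screen disparity d = IPD * (z - D) / z, whose sign distinguishes the positive, zero, and negative parallax areas. The following sketch assumes an IPD of 6.5 cm as a sample value.

```python
# For a screen at distance D and an object at distance z from the viewer,
# similar triangles give the on-screen disparity d = IPD * (z - D) / z;
# its sign classifies the parallax area. IPD = 0.065 m is an assumed value.

def disparity(z, screen_dist, ipd=0.065):
    return ipd * (z - screen_dist) / z


def parallax_area(z, screen_dist):
    d = disparity(z, screen_dist)
    if abs(d) < 1e-9:
        return "VOz (zero parallax: on the screen)"
    return ("VOp (positive: behind the screen)" if d > 0
            else "VOn (negative: in front of the screen)")


for z in (1.0, 2.0, 4.0):  # screen fixed at D = 2 m
    print(z, round(disparity(z, 2.0), 4), parallax_area(z, 2.0))
# 1.0 -0.065 VOn ...;  2.0 0.0 VOz ...;  4.0 0.0325 VOp ...
```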
In the accompanying drawing, a situation is illustrated in which a virtual object is positioned outside the space that can be expressed on the basis of a single physical screen.
That is, assuming that the visual field of each of a general viewer's left and right eyes is 90 degrees, a virtual object VO (e.g., the ball) is included within the visual field range (EFOV) but deviates from the image expressible space (VV) defined by the viewer's gaze and the physical screen (PS), such that the virtual object is present in an area which cannot actually be shown to the viewer.
That is, this depicts a situation in which a video image theoretically exists in a space where it could be expressed only by a holographic space display device.
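For illustration, the distinction between the visual field (EFOV) and the expressible space (VV) can be sketched as a simple 2D containment test; the geometry assumed here (a 90-degree field of view and a 2 m wide screen at 2 m) is an example, not taken from the embodiment.

```python
# Sketch of the situation described above: a point can lie inside the eyes'
# field of view (EFOV) yet outside the image-expressible space (VV) bounded
# by the physical screen. 2D top view; all values are assumed examples.
import math


def in_efov(pt, eye=(0.0, 0.0), half_fov_deg=45.0):
    # Is the point within the viewing cone looking along +y?
    dx, dy = pt[0] - eye[0], pt[1] - eye[1]
    return dy > 0 and abs(math.degrees(math.atan2(dx, dy))) <= half_fov_deg


def in_vv(pt, eye=(0.0, 0.0), screen_x=(-1.0, 1.0), screen_y=2.0):
    # VV: the frustum from the eye through the physical screen's edges.
    dx, dy = pt[0] - eye[0], pt[1] - eye[1]
    if dy <= 0:
        return False
    x_at_screen = eye[0] + dx * (screen_y / dy)  # project onto the screen plane
    return screen_x[0] <= x_at_screen <= screen_x[1]


ball = (3.0, 4.0)  # inside a 90-degree EFOV ...
print(in_efov(ball), in_vv(ball))  # True False -> visible but not expressible
```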
In all 3D image systems using a binocular vision type information display based on the existence of an image output surface (e.g., an LCD screen), the section of comfortable depth feeling which the user experiences is formed in a limited space around the physical and optical image surface. Therefore, output to the deeper, wider, and higher space which virtual contents (e.g., a 3D image medium) intend to express is limited with existing technology. For example, a space that deviates from the field of view (FOV) defined by the viewer's viewpoint and the image expression surface cannot be expressed physically and optically, and is an area which the user cannot perceive or which causes high visual fatigue to the user when an excessive image disparity is set.
Contrary to this, in the mixed reality display platform for presenting the augmented 3D stereo image according to the exemplary embodiment of the present invention, this limit in expressing a 3D spatial feeling can be overcome by dividing and sharing a virtual 3D expression space among a plurality of 3D display devices.
Hereinafter, a method for expressing the 3D image implemented by the mixed reality display platform for presenting the augmented 3D stereo image according to the exemplary embodiment of the present invention will be described based on various application examples.
The application example described below illustrates the limited natural 3D expression space that a single screen can provide.
In general, for the 3D effect of a faraway feeling using positive parallax, the binocular disparity approaches the distance between the viewer's left and right eyes (e.g., the inter-pupil distance, IPD) as the distance to the object increases, and in light of human 3D space perception characteristics, depth at long range is perceived by other factors such as overlap (occlusion) rather than by the binocular disparity effect. However, when negative parallax, in which the object comes close to the front of the viewer's eyes, is used, the absolute value of the binocular disparity (d) increases toward infinity, causing extreme visual fatigue. Therefore, for one screen, a limited space in which the distance value Dp of the positive area is larger than the distance value Dn of the negative area may be defined as the natural 3D image expression space. Here, since the expression of a faraway object approaches completely parallel vision, the positive parallax area is theoretically infinite, but it is limited to a predetermined area in consideration of visual fatigue.
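These limits follow directly from the disparity relation d = IPD * (z - D) / z sketched earlier: d saturates at the IPD as the object recedes, but diverges as the object approaches the eyes. A few sample values (assuming a screen at 2 m and an IPD of 6.5 cm) illustrate why Dn must be kept much shorter than Dp:

```python
# d = IPD * (z - D) / z: saturates at the IPD for faraway objects but
# diverges as the object approaches the eyes. Assumed values: D = 2 m
# (screen distance), IPD = 0.065 m.
IPD, D = 0.065, 2.0
for z in (0.1, 0.5, 1.0, 2.0, 10.0, 100.0):
    d = IPD * (z - D) / z
    print(f"z = {z:6.1f} m  ->  d = {d:+.3f} m")
# z = 0.1 -> d = -1.235 m (extreme crossed disparity: severe fatigue)
# z = 100 -> d = +0.064 m (approaches the IPD: nearly parallel vision)
```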
According to the exemplary embodiment of the present invention, the limit in expressing the 3D spatial feeling described above can be overcome by displaying the 3D image through a plurality of virtual screens using a plurality of 3D display devices.
The application examples presented in the accompanying drawings illustrate cases in which a plurality of 3D display devices divide and share the virtual 3D expression space around the user, including an example in which advertised products are presented to the user as augmented 3D stereo images.
As a similar implementable scenario, the user may experience virtually wearing clothes and accessories and virtually placing home interior products, and may receive help in deciding whether to purchase the advertised products.
According to exemplary embodiments of the present invention, a 3D image naturally expressed in a deeper, wider, and higher space can be presented by overcoming the limit of the 3D spatial expression achievable with a single 3D display device. Since various 3D stereo contents services that overcome this limitation on spatial expression can be provided using the 3D image technology, the services can be used in implementing virtual reality and mixed reality systems in various fields, such as the home appliance, education, training, medical, and military fields, based on a 3D display platform.
A number of exemplary embodiments have been described above. Nevertheless, it will be understood that various modifications may be made. For example, suitable results may be achieved if the described techniques are performed in a different order and/or if components in a described system, architecture, device, or circuit are combined in a different manner and/or replaced or supplemented by other components or their equivalents. Accordingly, other implementations are within the scope of the following claims.