This application claims priority from and the benefit of Korean Patent Application No. 10-2010-0008733, filed on Jan. 29, 2010, which is hereby incorporated by reference for all purposes as if fully set forth herein.
1. Field
Disclosed herein is a user interface using a hologram and a method thereof.
2. Discussion of the Background
Currently, a touch-type user interface that recognizes an input through an external contact is provided in a terminal, such as a laptop, desktop, or mobile terminal. In such a terminal, various functions are performed by recognizing a user's contact input through a touch-type user interface.
In general, a touch-type user interface may include a touch pad, touch screen, or the like, which provides a two-dimensional touch-type user interface through a screen. At this time, various virtual objects, such as icons, for user input are displayed on the screen.
If a user's contact occurs on a screen, such a touch-type user interface recognizes that a virtual object displayed at the position at which the user's contact occurs on the screen is selected by a user, and recognizes that an instruction for executing a specified function corresponding to the selected virtual object is inputted by the user. Accordingly, the user interface allows a terminal to execute the specified function corresponding to the virtual object selected by the user's contact among virtual objects displayed on the screen.
Meanwhile, a user interface that provides a three-dimensional touch-type user interface using a hologram has recently been developed as an extension of the two-dimensional touch-type user interface.
In such a user interface using a hologram, a hologram display area is displayed in an arbitrary area in a space, and various virtual objects for user input are displayed in the hologram display area. The user interface recognizes that a virtual object among the displayed virtual objects is selected by a user, and recognizes that an instruction for executing a specified function corresponding to the selected virtual object is inputted by the user. Accordingly, the user interface allows a terminal to execute the specified function corresponding to the virtual object selected by the user's contact.
However, if a contact with a displayed virtual object occurs, the user interface using the hologram recognizes that the virtual object is selected by a user and that an instruction for executing a specified function corresponding to the selected virtual object is inputted by the user. Hence, when a real object, such as a part of a user's body, merely passes through the hologram display area displayed in a space and thereby comes in contact with a specified virtual object, the user interface nevertheless recognizes that the specified virtual object is selected by the user and that an instruction for executing the specified function corresponding to the selected virtual object is inputted. Therefore, a malfunction of a terminal may be caused.
Disclosed herein is a user interface using a hologram, which displays virtual objects for user input in a space using the hologram, and recognizes the user's various inputs through the displayed virtual objects.
Also, disclosed herein is a user interface using a hologram, which can provide feedback to a user through a visual or a tactile effect.
Additional features of the invention will be set forth in the description which follows, and in part will be apparent from the description, or may be learned by practice of the invention.
An exemplary embodiment provides a user interface, including a memory unit to store information on a shape, a function, a position, and a movement pattern for a virtual object; a hologram output unit to project a hologram display area and to display the virtual object in the projected hologram display area; a real object sensing unit to sense a real object in the hologram display area and to generate information on a position and a movement pattern of the real object; a contact recognizing unit to determine the positions and the movement patterns of the respective virtual object and real object in the hologram display area according to the information on the position and movement pattern of the real object generated by the real object sensing unit, and the information stored in the memory unit to recognize a contact between the virtual object and the real object; and a control unit to determine whether the recognized contact between the virtual object and the real object corresponds to an input for selecting the virtual object.
An exemplary embodiment provides a user interface, including a memory unit to store information on a shape, a function, a position, and a movement pattern for a virtual object; a hologram output unit to project a hologram display area and to display the virtual object in the projected hologram display area; a communication unit to receive a wireless signal transmitted from a real object that transmits the wireless signal, the wireless signal containing information; a real object sensing unit to receive the wireless signal from the communication unit, to extract the information contained in the wireless signal, and to generate information on a position and a movement pattern of the real object in the hologram display area according to the wireless signal; a contact recognizing unit to determine the positions and the movement patterns of the respective virtual object and the real object in the hologram display area according to the information on the position and the movement pattern of the real object generated by the real object sensing unit, and the information stored in the memory unit to recognize a contact between the virtual object and the real object; and a control unit to determine a function of the real object that comes in contact with the virtual object according to the information of the real object extracted by the real object sensing unit.
An exemplary embodiment provides a user interface, including a memory unit to store information on a virtual object; a hologram output unit to project the virtual object in a hologram display area; a real object sensing unit to sense a real object in the hologram display area; a contact recognizing unit to determine a contact between the real object and the virtual object according to the information on the virtual object and information on the sensed real object; and a control unit to determine whether the recognized contact corresponds to an input for selecting the virtual object.
An exemplary embodiment provides a method for a user interface, the method including displaying a virtual object in a hologram display area; determining if a contact between a real object and the virtual object occurs; determining if the contact between the real object and the virtual object corresponds to an input for selecting the virtual object; moving the selected virtual object according to a movement of the real object; and executing a function corresponding to the selected virtual object according to the movement of the selected virtual object.
An exemplary embodiment provides a method for a user interface, the method including displaying a virtual object in a hologram display area; determining if a contact between a real object and the virtual object occurs; determining a function of the real object if the contact occurs; and executing the function of the real object with respect to the virtual object.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory and are intended to provide further explanation of the invention as claimed. Other features and aspects will be apparent from the following detailed description, the drawings, and the claims.
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention, and together with the description serve to explain the principles of the invention.
The invention is described more fully hereinafter with reference to the accompanying drawings, in which exemplary embodiments are shown. This disclosure may, however, be embodied in many different forms and should not be construed as limited to the exemplary embodiments set forth herein. Rather, these exemplary embodiments are provided so that this disclosure will be thorough, and will fully convey the scope of this disclosure to those skilled in the art. In the description, details of well-known features and techniques may be omitted to avoid unnecessarily obscuring the presented embodiments.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of this disclosure. As used herein, the singular forms “a”, “an”, and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. Furthermore, the use of the terms “a”, “an”, etc. does not denote a limitation of quantity, but rather denotes the presence of at least one of the referenced item. The use of the terms “first”, “second”, and the like does not denote any particular order or importance; rather, these terms are used to distinguish one element from another. It will be further understood that the terms “comprises” and/or “comprising”, or “includes” and/or “including”, when used in this specification, specify the presence of stated features, regions, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, regions, integers, steps, operations, elements, components, and/or groups thereof.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and the present disclosure, and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
In the drawings, like reference numerals denote like elements. The shapes, sizes, regions, and the like of the drawings may be exaggerated for clarity.
Hereinafter, a user interface using a hologram and a method for recognizing an input of the user interface according to exemplary embodiments will be described in detail with reference to the accompanying drawings.
The memory unit 110 stores information on a shape, a function, an initial position, and an initial movement pattern for each virtual object. The information on the initial position includes a three-dimensional position coordinate and the like. The information on the initial movement pattern includes a three-dimensional position coordinate, a vector value (i.e., a movement distance, a direction, and a velocity), and the like.
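The per-object record described above can be sketched as a simple data structure. The field names and types below are illustrative assumptions; the specification states only that a shape, a function, an initial position (a three-dimensional coordinate), and an initial movement pattern (a coordinate plus a vector value of distance, direction, and velocity) are stored.

```python
from dataclasses import dataclass, field

# Hypothetical record mirroring the information the memory unit 110 is
# described as storing for each virtual object. Names are illustrative.
@dataclass
class VirtualObject:
    shape: str                          # e.g. "icon"
    function: str                       # function executed when selected
    position: tuple                     # initial 3-D position coordinate (x, y, z)
    movement: dict = field(default_factory=lambda: {
        "distance": 0.0,                # movement distance
        "direction": (0.0, 0.0, 0.0),   # direction vector
        "velocity": 0.0,                # speed
    })

obj = VirtualObject(shape="icon", function="execute", position=(1.0, 2.0, 3.0))
```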
The hologram output unit 120 projects a hologram display area in an arbitrary area in a space under the control of the control unit 160, and displays virtual objects in the projected hologram display area. The space in which the hologram display area is projected may be adjacent to and/or outside of the touch-type user interface 100.
The real object sensing unit 130 senses a real object that exists in the hologram display area, and generates information on a position and a movement pattern of the real object 10 (shown in
The real object sensing unit 130 may obtain the three-dimensional coordinate of the real object 10 that exists in the hologram display area using one of a capacitive touch screen method, an infrared (IR) touch screen method, an electromagnetic resonance (EMR) digitizer method, an image recognizing method, and the like.
The real object sensing unit 130 receives a wireless signal transmitted from the real object 10, and determines a distance to the real object 10 using the reception intensity of the received wireless signal. Then, the real object sensing unit 130 determines the three-dimensional position coordinate of the real object 10 using the determined distance from the real object 10 and the reception direction of the wireless signal. The real object sensing unit 130 may have a communication unit (not shown) to perform wireless communications with real object 10.
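The distance-from-intensity step can be sketched as follows. The specification says only that distance is determined from the reception intensity and combined with the reception direction; the log-distance path-loss model, the reference transmit power, and the path-loss exponent below are assumptions made for illustration.

```python
import math

def distance_from_rssi(rssi_dbm, tx_power_dbm=-40.0, path_loss_exp=2.0):
    """Estimate distance (in meters) from received signal strength using
    a log-distance path-loss model (an assumed model; the specification
    does not name one). tx_power_dbm is the assumed RSSI at 1 m."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10.0 * path_loss_exp))

def position_from_signal(rssi_dbm, unit_direction, **kw):
    """Combine the estimated distance with the reception direction of the
    wireless signal (a unit vector) to obtain the real object's
    three-dimensional position coordinate."""
    d = distance_from_rssi(rssi_dbm, **kw)
    return tuple(d * c for c in unit_direction)
```

For example, with the assumed parameters a signal received at -60 dBm from directly above yields a position ten meters away along that direction.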
The tactile sense providing unit 140 provides an acoustic radiation pressure to the hologram display area by radiating an acoustic wave under the control of the control unit 160. As a result, the real object 10 that exists in the hologram display area is influenced by the acoustic radiation pressure provided from the tactile sense providing unit 140.
The contact recognizing unit 150 identifies, in real time, the positions and movement patterns of the real object 10 and the virtual object in the hologram display area projected by the hologram output unit 120, using the information on the position and movement pattern of the real object 10 generated by the real object sensing unit 130 and the information stored in the memory unit 110. Thus, the contact recognizing unit 150 determines whether a contact between the virtual object and the real object 10 occurs in the hologram display area. If the contact recognizing unit 150 determines that a contact between the virtual object and the real object 10 occurs, the contact recognizing unit 150 detects the contact part of the virtual object that comes in contact with the real object 10. Specifically, if any of the three-dimensional position coordinates of the virtual object and the real object 10 overlap each other in the hologram display area, the contact recognizing unit 150 recognizes that a contact between the virtual object and the real object 10 occurs. The contact recognizing unit 150 may also recognize the overlapping three-dimensional position coordinates as the three-dimensional position coordinates of the contact part of the virtual object that comes in contact with the real object 10.
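The overlap test performed by the contact recognizing unit 150 can be sketched as a set intersection. Modelling each shape as a set of discretized (voxel) three-dimensional coordinates is an assumption made for illustration; the specification does not prescribe a representation.

```python
def find_contact(virtual_coords, real_coords):
    """Recognize a contact between a virtual object and a real object:
    if any 3-D position coordinates of the two overlap, a contact has
    occurred, and the overlapping coordinates are taken as the contact
    part of the virtual object. Shapes are modelled as sets of voxel
    coordinates (an illustrative assumption)."""
    overlap = set(virtual_coords) & set(real_coords)
    return (len(overlap) > 0, overlap)
```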
Meanwhile, the control unit 160 controls the hologram output unit 120 to project a hologram display area, and controls virtual objects to be displayed in the projected hologram display area. The control unit 160 controls virtual objects for providing various functions to be respectively displayed at their initial positions or to be respectively moved in their initial patterns using the information stored in the memory unit 110.
If the contact recognizing unit 150 recognizes that a contact between the virtual object and the real object 10 occurs in the hologram display area, the control unit 160 determines whether the contact between the virtual object and the real object 10 is an input for virtual object selection. If the control unit 160 determines that the contact between the virtual object and the real object 10 is an input for selecting the virtual object, the control unit 160 detects a function of the virtual object that comes in contact with the real object 10 by searching the information stored in the memory unit 110, and recognizes that an instruction for executing the detected function is inputted.
If the contact recognizing unit 150 recognizes that a contact between the virtual object and the real object 10 occurs in the hologram display area, the control unit 160 determines whether the contact between the virtual object and the real object 10 corresponds to the selection of the virtual object. If the control unit 160 determines that the contact between the virtual object and the real object 10 is an input for selecting the virtual object or an input for canceling the selection of the virtual object, the control unit 160 controls the hologram output unit 120, thereby changing a color or a shape of the virtual object that comes in contact with the real object 10. Accordingly, a user can visually identify whether the virtual object is selected. The control unit 160 also controls the tactile sense providing unit 140 to provide an acoustic radiation pressure to the hologram display area. As a result, when the real object 10 corresponds to a part of a user's body, the user can identify via tactile sense whether the virtual object is selected.
When the real object 10 comes in contact with the virtual object for longer than a reference time or when the real object 10 simultaneously comes in contact with a plurality of markers that exist at parts of the virtual object, the control unit 160 may determine that the contact between the virtual object and the real object 10 is an input for selecting the virtual object. The reference time may be predetermined or selectable.
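The two selection criteria just described, contact held longer than a reference time or simultaneous contact with a plurality of markers, can be sketched as a single predicate. The numeric thresholds are illustrative only; the specification says the reference time may be predetermined or selectable.

```python
def is_selection(contact_duration_s, touched_markers,
                 reference_time_s=0.5, required_markers=2):
    """Treat a contact as an input for selecting the virtual object when
    the real object stays in contact longer than a reference time, or
    when it simultaneously touches a plurality of markers on the virtual
    object. Threshold values are illustrative assumptions."""
    return (contact_duration_s > reference_time_s
            or len(touched_markers) >= required_markers)
```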
If it is determined that the contact between the virtual object and the real object 10 is an input for selecting the virtual object, the control unit 160 traces, in real time, the movement of the real object 10 in the hologram display area using the information on the movement pattern of the real object 10 generated by the real object sensing unit 130. The control unit 160 determines whether the real object 10 that contacts the virtual object is out of the hologram display area, i.e., a range sensed by the real object sensing unit 130. If the control unit 160 determines that the real object 10 is out of or exits the range, or that the contact of the real object 10 is released from one of the plurality of markers with which the real object 10 simultaneously comes in contact, the control unit 160 determines that the input for selecting the virtual object is cancelled, and controls the hologram output unit 120 to change the color or the shape of the virtual object that comes in contact with the real object 10. The control unit 160 also controls the tactile sense providing unit 140 to provide an acoustic radiation pressure to the hologram display area.
If it is determined that the contact between the virtual object and the real object 10 is an input for selecting the virtual object, the control unit 160 traces, in real time, the movement of the real object 10 in the hologram display area using the information on the movement pattern of the real object 10 generated by the real object sensing unit 130. The control unit 160 also controls the hologram output unit 120 to allow the virtual object that comes in contact with the real object 10 to be moved corresponding to the movement of the real object 10. Based on the movement of the virtual object, the control unit 160 recognizes that an instruction for executing a specified function is inputted by a user or that an instruction for canceling the execution of the specified function is inputted by the user.
For example, the control unit 160 controls the hologram output unit 120 to rotate the virtual object based on the rotational movement of the real object 10 that comes in contact with the virtual object, or to drag the virtual object to the movement position of the real object 10 based on the movement of the real object 10 that comes in contact with the virtual object. Based on the rotating or dragging position of the virtual object, the control unit 160 recognizes that an instruction for executing a specified function is inputted by a user or that an instruction for canceling the execution of the specified function is inputted by the user. For example, if the virtual object is rotated at an arbitrary angle in an arbitrary direction; if the virtual object is dragged to the position at which an arbitrary virtual object, such as an icon, for providing an executing or canceling function is displayed in the hologram display area; or if an arbitrary virtual object such as an icon for providing an executing or canceling function is dragged to the position at which the virtual object to be executed or cancelled is displayed in the hologram display area, the control unit 160 may recognize that an instruction for executing a specified function is inputted by a user or that an instruction for canceling the specified function is inputted by the user.
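The drag-based recognition described above can be sketched as a position comparison. The icon positions and the contact tolerance below are illustrative assumptions, not values given in the specification.

```python
def interpret_drag(dragged_to, execute_icon_pos, cancel_icon_pos, tol=0.1):
    """Recognize an instruction from where the selected virtual object is
    dragged: dropping it onto the position of an 'execute' icon or a
    'cancel' icon yields the corresponding instruction. Positions are
    3-D coordinates; tol is an assumed proximity tolerance."""
    def near(a, b):
        return all(abs(x - y) <= tol for x, y in zip(a, b))
    if near(dragged_to, execute_icon_pos):
        return "execute"
    if near(dragged_to, cancel_icon_pos):
        return "cancel"
    return None  # drag released elsewhere: no instruction recognized
```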
If the movement pattern of the real object 10 matches a specified movement pattern, as determined using the information on the movement pattern of the real object 10 generated by the real object sensing unit 130, the control unit 160 may recognize that an instruction for executing a specified function is inputted by a user or that an instruction for canceling the specified function is inputted by the user.
If the control unit 160 recognizes that an instruction for executing a specified function is inputted by a user or that an instruction for canceling the specified function is inputted by the user, the control unit 160 controls the hologram output unit 120 to change the color or the shape of the hologram display area or virtual object displayed in the hologram display area. The control unit 160 may control the tactile sense providing unit 140 to provide an acoustic radiation pressure to the hologram display area.
Hereinafter, a method for recognizing an input in the user interface using the hologram according to an exemplary embodiment will be described with reference to
First, the user interface 100 using the hologram projects a hologram display area in a space, and displays virtual objects in the projected hologram display area (S200).
If a contact between a real object 10 and one of the virtual objects displayed in operation S200 occurs (S210), the control unit 160 determines whether the contact between the virtual object and the real object 10 corresponds to an input for selecting the virtual object (S220).
When it is determined in operation S220 that the contact between the virtual object and the real object 10 corresponds to the input for selecting the virtual object, the control unit 160 controls the hologram output unit 120 to change a color or a shape of the virtual object that comes in contact with the real object 10. Then, the control unit 160 controls the tactile sense providing unit 140 to provide an acoustic radiation pressure to the hologram display area.
In operation S220, if the real object 10 comes in contact with the virtual object for longer than a reference time or if the real object 10 simultaneously comes in contact with a plurality of markers that exist at parts of the virtual object, the control unit 160 determines that the contact between the virtual object and the real object 10 corresponds to the input for selecting the virtual object. For example, if a user's finger, i.e., a real object 10, comes in contact with an icon having an executing function, i.e., a virtual object, for longer than a reference time as illustrated in
If it is determined in operation S220 that the contact between the virtual object and the real object 10 is the input for selecting the virtual object, the control unit 160 traces, in real time, the movement of the real object 10 in the hologram display area, and controls the hologram output unit 120 to allow the virtual object that comes in contact with the real object 10 to be moved corresponding to the movement of the real object 10 (S230). Based on the movement of the virtual object, the control unit 160 recognizes that an instruction for executing a specified function is inputted by a user or that an instruction for canceling the execution of the specified function is inputted by the user (S240).
In operation S240, if the icon, i.e., the virtual object, is rotated at an arbitrary angle in an arbitrary direction in operation S230 as illustrated in
If it is recognized in operation S240 that the instruction for executing the specified function is inputted by the user or that the instruction for canceling the execution of the specified function is inputted by the user, the control unit 160 may control the hologram output unit 120 to change a color or a shape of the hologram display area or the virtual object displayed in the hologram display area. Then, the control unit 160 may control the tactile sense providing unit 140 to provide an acoustic radiation pressure to the hologram display area.
When it is determined in operation S220 that the contact between the virtual object and the real object 10 is the input for selecting the virtual object, the control unit 160 traces the movement of the real object 10 in the hologram display area, and determines whether the real object 10 that comes in contact with the virtual object is out of or exits the hologram display area, i.e., a range for tracing the movement of the real object 10. If it is determined that the real object 10 is out of or exits the range, or that the contact of the real object 10 is released from one of the plurality of markers with which the real object 10 simultaneously comes in contact, the control unit 160 determines that the input for selecting the virtual object is cancelled, and controls the hologram output unit 120 to change the color or the shape of the virtual object displayed in the hologram display area. Then, the control unit 160 may control the tactile sense providing unit 140 to provide an acoustic radiation pressure to the hologram display area.
The memory unit 110 stores information on a shape, a function, an initial position and an initial movement pattern for each virtual object. The information on the initial position includes a three-dimensional position coordinate and the like. The information on the initial movement pattern includes a three-dimensional position coordinate, a vector value (i.e., a movement distance, a direction and a velocity), and the like.
The hologram output unit 120 projects a hologram display area in an arbitrary area in a space under the control of the control unit 160, and displays virtual objects in the projected hologram display area.
If a wireless signal is received in the communication unit 170 from a real object 10a and/or 10b that exists in the hologram display area, the real object sensing unit 130 extracts functional information of the real object 10a and/or 10b contained in the received wireless signal and generates information on the position and movement pattern of the real object 10a and/or 10b in the hologram display area using the received wireless signal. Then, the real object sensing unit 130 provides the generated information to the control unit 160.
The real objects 10a and 10b may include different functions. For example, the real object 10a may include or represent a function of inputting a selection, a function of inputting an execution instruction, or the like, and the real object 10b may include or represent a function of inputting the cancellation of a selection, a function of inputting a cancellation instruction, or the like. The real objects 10a and 10b may each be a small-sized device having a function of transmitting a wireless signal containing information on the included function. The small-sized device may be formed in a shape attachable to a user's finger.
The communication unit 170 performs wireless communications with the real objects 10a and/or 10b. The communication unit 170 receives a wireless signal transmitted from the real object and provides the received wireless signal to the real object sensing unit 130. For example, the communication unit 170 may include a directional antenna module (not shown) or the like.
The real object sensing unit 130 receives from the communication unit 170 a wireless signal transmitted from the real object 10a and/or 10b that exists in the hologram display area, and determines a distance to the real object 10a and/or 10b using the reception intensity of the received wireless signal. Then, the real object sensing unit 130 obtains the three-dimensional position coordinate of the real object 10a and/or 10b in the hologram display area using the determined distance and the reception direction of the wireless signal, and generates information on the position of the real object 10a and/or 10b using the obtained three-dimensional position coordinate. The real object sensing unit 130 also calculates a vector value from a change in the three-dimensional position coordinate of the real object 10a and/or 10b, and generates information on the movement pattern of the real object 10a and/or 10b using the calculated vector value.
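The vector-value calculation, a movement distance, a direction, and a velocity derived from a change in the three-dimensional position coordinate, can be sketched as follows. The sampling interval `dt` is an assumed parameter; the specification does not state how often positions are sampled.

```python
import math

def movement_vector(p_prev, p_curr, dt):
    """Derive the vector value (movement distance, direction, velocity)
    from two successive 3-D position coordinates of a real object,
    sampled dt seconds apart (dt is an illustrative assumption)."""
    delta = tuple(c - p for p, c in zip(p_prev, p_curr))
    distance = math.sqrt(sum(d * d for d in delta))
    direction = (tuple(d / distance for d in delta)
                 if distance > 0 else (0.0, 0.0, 0.0))
    return {"distance": distance, "direction": direction,
            "velocity": distance / dt}
```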
The tactile sense providing unit 140 provides an acoustic radiation pressure to the hologram display area by radiating an acoustic wave under the control of the control unit 160. As a result, the real object 10a and/or 10b that exists in the hologram display area is influenced by the acoustic radiation pressure provided from the tactile sense providing unit 140.
The control unit 160 controls the hologram output unit 120 to project a hologram display area, and controls virtual objects to be displayed in the projected hologram display area. The control unit 160 controls virtual objects for providing various functions to be respectively displayed at their initial positions or to be respectively moved in their initial patterns using the information stored in the memory unit 110.
The contact recognizing unit 150 identifies, in real time, the positions and movement patterns of the respective real object 10a and/or 10b and virtual object in the hologram display area projected by the hologram output unit 120 using the information on the position and movement pattern of the real object 10a and/or 10b generated by the real object sensing unit 130, and the information stored in the memory unit 110. Thus, the contact recognizing unit 150 determines whether a contact between the virtual object and the real object 10a and/or 10b occurs in the hologram display area. If a part of the three-dimensional position coordinates corresponding to shapes of the respective virtual object and real object 10a and/or 10b are overlapped in the hologram display area, the contact recognizing unit 150 recognizes that the contact between the virtual object and the real object 10a and/or 10b occurs.
If the contact recognizing unit 150 recognizes that a contact between the virtual object and the real object 10a and/or 10b occurs in the hologram display area, the control unit 160 identifies the function of the real object 10a and/or 10b that comes in contact with the virtual object using the functional information of the real object 10a and/or 10b extracted by the real object sensing unit 130, and recognizes that the contact between the virtual object and the real object 10a and/or 10b is a user's input based on the identified function of the real object 10a and/or 10b. The control unit 160 may determine whether the contact between the virtual object and the real object 10a and/or 10b corresponds to an input for selecting the virtual object or an input for canceling the selection, whether the contact between the virtual object and the real object 10a and/or 10b corresponds to an instruction for executing an arbitrary function or an instruction for canceling the execution of the arbitrary function, or the like.
If it is determined that the contact between the virtual object and the real object 10a and/or 10b corresponds to the input for selecting the virtual object or the input for canceling the selection, the control unit 160 may control the hologram output unit 120 to change a color or a shape of the virtual object that comes in contact with the real object 10a and/or 10b. The control unit 160 may also control the tactile sense providing unit 140 to provide an acoustic radiation pressure to the hologram display area.
If it is determined that the contact between the virtual object and the real object 10a and/or 10b is an input for selecting the virtual object, the control unit 160 traces, in real time, the movement of the real object 10a and/or 10b in the hologram display area using the information on the movement pattern of the real object 10a and/or 10b generated by the real object sensing unit 130. Then, the control unit 160 controls the hologram output unit 120 to allow the virtual object that comes in contact with the real object 10a and/or 10b to be moved corresponding to the movement of the real object 10a and/or 10b.
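The movement-tracing behavior above amounts to updating the selected virtual object's position by the real object's displacement each frame. A minimal sketch, with illustrative names:

```python
# Hypothetical sketch of the drag behavior: once selected, the virtual
# object follows the traced movement of the real object frame by frame.
def follow_real_object(virtual_pos, real_prev, real_now):
    """Move the virtual object by the real object's 3D displacement."""
    delta = tuple(n - p for n, p in zip(real_now, real_prev))
    return tuple(v + d for v, d in zip(virtual_pos, delta))

pos = (5.0, 5.0, 5.0)                                   # virtual object
pos = follow_real_object(pos, (0.0, 0.0, 0.0), (1.0, 0.5, 0.0))
print(pos)  # (6.0, 5.5, 5.0)
```

The control unit would feed each updated position to the hologram output unit so the displayed object tracks the user's hand or stylus.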
If it is recognized that an instruction for executing an arbitrary function or an instruction for canceling the execution of the arbitrary function is inputted by the user, the control unit 160 controls the hologram output unit 120 to change the color or the shape of the hologram display area or of the virtual object displayed therein. The control unit 160 may also control the tactile sense providing unit 140 to provide an acoustic radiation pressure to the hologram display area.
A method for recognizing an input in the user interface using the hologram according to an exemplary embodiment will hereinafter be described with reference to
First, the user interface 200 using the hologram projects a hologram display area in a space, and displays virtual objects in the projected hologram display area (S300).
If it is determined that a contact between a real object 10a and/or 10b and one of the virtual objects displayed in operation S300 occurs (S310), the control unit 160 identifies a function of the real object 10a and/or 10b that comes in contact with the virtual object (S320), and recognizes that the contact between the virtual object and the real object 10a and/or 10b is a user's input based on the identified function of the real object 10a and/or 10b (S330).
In operation S330, the control unit 160 may determine whether the contact between the virtual object and the real object 10a and/or 10b corresponds to an input for selecting the virtual object or an input for canceling the selection, or whether an instruction for executing a specified function or an instruction for canceling the execution of that function is inputted by the user.
If it is determined in operation S330 that the contact between the virtual object and the real object 10a and/or 10b corresponds to the input for selecting the virtual object, the control unit 160 traces, in real time, the movement of the real object 10a and/or 10b, and controls the hologram output unit 120 to allow the virtual object that comes in contact with the real object 10a and/or 10b to be moved corresponding to the movement of the real object 10a and/or 10b.
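Operations S300 through S330 described above can be sketched as a single recognition pass. This is a hypothetical structure for illustration only; the callables stand in for the hardware units of the disclosure:

```python
# Hypothetical sketch of the recognition flow S300-S330.
def recognize_input(display, contact_detector, function_reader, interpreter):
    display()                    # S300: project hologram, show objects
    hit = contact_detector()     # S310: real/virtual contact occurred?
    if hit is None:
        return None              # no contact: nothing to recognize
    fn = function_reader(hit)    # S320: identify real object's function
    return interpreter(fn)       # S330: recognize contact as user input

# Usage with stub callables standing in for the hardware units:
result = recognize_input(
    display=lambda: None,
    contact_detector=lambda: "object_a",
    function_reader=lambda hit: "select",
    interpreter=lambda fn: f"input:{fn}",
)
print(result)  # input:select
```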
If it is determined in operation S330 that the contact between the virtual object and the real object 10a and/or 10b corresponds to the input for selecting the virtual object or the input for canceling the selection, the control unit 160 controls the hologram output unit 120 to change the color or the shape of the virtual object that comes in contact with the real object 10a and/or 10b. For example, in operation S330, if a real object 10a having a function of inputting a selection comes in contact with a virtual object as illustrated in
For example, in operation S330, if a real object 10a having a function of inputting an execution instruction comes in contact with a virtual object as illustrated in
The user interface using the hologram, disclosed herein, is not limited to the aforementioned embodiments but may be variously modified within the scope allowed by the technical spirit disclosed herein.
According to a user interface using a hologram disclosed herein, virtual objects for user input are displayed in a space using a hologram, and a user's input is recognized through the displayed virtual objects.
Also, according to a user interface using a hologram disclosed herein, as a user's input is recognized, the recognition of the user's input is fed back to a user through a visual or tactile effect.
It will be apparent to those skilled in the art that various modifications and variations can be made in the present invention without departing from the spirit or scope of the invention. Thus, it is intended that the present invention cover the modifications and variations of this invention provided they come within the scope of the appended claims and their equivalents.
Number | Date | Country | Kind |
---|---|---|---|
10-2010-0008733 | Jan 2010 | KR | national |