This application claims the priority of Korean Patent Application No. 10-2010-0134518 filed on Dec. 24, 2010, in the Korean Intellectual Property Office, the disclosure of which is incorporated herein by reference.
1. Field of the Invention
The present invention relates to a user interface apparatus and method, and more particularly, to a user interface apparatus and method using a two-dimensional (2D) image sensor capable of setting an object for an interface from a user image detected by the 2D image sensor and performing a user input accordingly.
2. Description of the Related Art
A user interface (UI) refers to the commands and techniques by which general users control data input or operations in a computer system or program. The ultimate purpose of a UI is to allow users to communicate with a computer or program so that they can use it easily and conveniently.
Related art computers were designed with a focus on improving the efficiency or speed of computation. As for the connection between the user and the computer, they were designed on the premise that the user knows everything about the computer.
Currently, however, computers are no longer used exclusively by experts who know everything about them; a computer is no longer a machine that simply performs calculations, but is increasingly utilized as a tool for enhancing users' creativity. Thus, UIs have come to be developed as tools that take the convenience of non-expert users into consideration and improve the performance of the overall system.
An aspect of the present invention provides a user interface apparatus and method capable of providing a more practical, convenient user interface to users by using an image obtained by a two-dimensional (2D) image sensor which is currently prevalent.
According to an aspect of the present invention, there is provided a user interface apparatus including: a two-dimensional (2D) image sensor; and a user interface unit detecting a movement of a user captured by the 2D image sensor, setting the detected movement as an object, and outputting the movement of the object detected by the 2D image sensor to a system with which the user intends to interface.
The user interface unit may include: an interrupt determining part determining a type of user interrupt; an object setting part setting a particular area of an input image as an object and storing corresponding content when the user interrupt is determined to be for setting a new object; and an object recognizing part recognizing the object from the input image based on information of the object previously set and stored, determining a command desired to be performed by the object, and outputting a user interface command signal.
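For concreteness, the structure of such a user interface unit can be outlined in code. The following is a minimal sketch only; all class and method names (UserInterfaceUnit, InterruptDeterminer, and so on) are hypothetical illustrations and do not appear in the disclosure:

```python
# Minimal structural sketch of the user interface unit described above.
# All class and method names are hypothetical and chosen for illustration.

class InterruptDeterminer:
    def determine(self, frame):
        # Placeholder policy: in practice the interrupt type would come
        # from a dedicated user action (e.g., a key press or a gesture).
        return "RECOGNIZE_OBJECT"

class ObjectSetter:
    def __init__(self):
        self.stored = None

    def set_object(self, frame):
        self.stored = frame           # store features of the designated area
        print("new object set")       # feed the setting back to the user

class ObjectRecognizer:
    def recognize(self, frame, stored):
        # Compare the input image against the stored object information
        # and output a user interface command signal when it matches.
        return "UI_COMMAND" if stored is not None else None

class UserInterfaceUnit:
    def __init__(self):
        self.determiner = InterruptDeterminer()
        self.setter = ObjectSetter()
        self.recognizer = ObjectRecognizer()

    def process(self, frame):
        interrupt = self.determiner.determine(frame)
        if interrupt == "SET_NEW_OBJECT":
            self.setter.set_object(frame)
        elif interrupt == "RECOGNIZE_OBJECT":
            return self.recognizer.recognize(frame, self.setter.stored)

unit = UserInterfaceUnit()
print(unit.process("frame"))          # no object stored yet -> None
```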
The object setting part may feed back the setting of the new object to the user.
The object recognizing part may recognize the object from the input image in consideration of shape, size, texture, and color of the object previously stored.
When the movement of the object is repeated in two opposite directions, the object recognizing part may determine that one of the directions is valid.
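A simple way to realize this behavior is to collapse per-frame displacements into direction strokes and discard a stroke that merely returns the previous one. The sketch below is an illustrative assumption; the threshold value and the suppression policy are not specified in the disclosure:

```python
# Hedged sketch: suppress the "return stroke" when an object (e.g., a hand)
# is moved back and forth, so that only one direction registers as valid.

def strokes(xs, threshold=20):
    """Collapse per-frame x displacements into direction strokes (+1/-1)."""
    dirs = []
    for a, b in zip(xs, xs[1:]):
        dx = b - a
        if abs(dx) < threshold:
            continue                      # too small: treat as jitter
        d = 1 if dx > 0 else -1
        if not dirs or dirs[-1] != d:
            dirs.append(d)
    return dirs

def valid_directions(xs, threshold=20):
    """Keep a stroke only if it is not the return of the previous one."""
    out = []
    prev = 0
    for d in strokes(xs, threshold):
        if d == -prev:
            prev = 0                      # return stroke: consume silently
            continue
        out.append(d)
        prev = d
    return out

# A hand waved right and back yields a single rightward command:
print(valid_directions([0, 50, 100, 50, 0]))   # [1]
# Two full right-and-back waves yield two rightward commands:
print(valid_directions([0, 100, 0, 100, 0]))   # [1, 1]
```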
The object recognizing part may display, on a display unit, an image representing a certain function performed through the user interface, and may form the user interface according to a relationship between the image representing the certain function and a cursor, using the position of the recognized object as the cursor.
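In code, this cursor-based interface reduces to a hit test between the recognized object's position and the screen regions occupied by the function images. The region layout and function names below are assumptions made for illustration:

```python
# Illustrative sketch of using the recognized object's position as a cursor
# over on-screen function images. Rectangles and names are assumptions.

FUNCTION_REGIONS = {
    "volume_up":   (0,   0, 100, 100),   # (x0, y0, x1, y1) in pixels
    "volume_down": (0, 120, 100, 220),
}

def hit_test(cursor, regions=FUNCTION_REGIONS):
    """Return the function whose on-screen image contains the cursor."""
    cx, cy = cursor
    for name, (x0, y0, x1, y1) in regions.items():
        if x0 <= cx <= x1 and y0 <= cy <= y1:
            return name
    return None

# The recognized object at (50, 150) selects the "volume_down" image:
print(hit_test((50, 150)))
```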
According to another aspect of the present invention, there is provided a user interface method including: determining a type of user interrupt; capturing a user image and setting a new object from the captured user image, when the user interrupt is determined to be for inputting the new object; and recognizing an object from an input image by using a previously stored object feature and performing a user interface operation according to a movement of the recognized object, when the user interrupt is determined to be for recognizing the object.
The above and other aspects, features and other advantages of the present invention will be more clearly understood from the following detailed description taken in conjunction with the accompanying drawings.
Embodiments of the present invention will now be described in detail with reference to the accompanying drawings. The invention may, however, be embodied in many different forms and should not be construed as being limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art. In the drawings, the shapes and dimensions of components may be exaggerated for clarity, and the same reference numerals will be used throughout to designate the same or like components.
In the embodiment of the present invention, the user interface unit 13 may receive a user command by using a user image detected by the 2D image sensor 11. Image processing is performed by the image signal processor (ISP) 12 on the signals representing the user image detected by the 2D image sensor 11. Through this image processing, the color, chroma, brightness, and the like, of the user image signals are adjusted to enhance image quality.
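As a rough illustration of what such an adjustment stage does, the toy routine below scales the brightness and the chroma of an RGB frame. A real ISP pipeline is considerably more involved (demosaicing, white balance, noise reduction, and so on); the gains and the luminance estimate here are illustrative assumptions only:

```python
import numpy as np

# Toy stand-in for the ISP adjustment stage: scale brightness and a
# saturation-like color gain of an RGB frame. Values are illustrative.

def enhance(frame_rgb, brightness=1.1, color_gain=1.05):
    """frame_rgb: HxWx3 uint8 array; returns an adjusted uint8 array."""
    f = frame_rgb.astype(np.float32)
    gray = f.mean(axis=2, keepdims=True)     # crude luminance estimate
    f = gray + (f - gray) * color_gain       # scale chroma around luminance
    f = f * brightness                       # scale overall brightness
    return np.clip(f, 0, 255).astype(np.uint8)

frame = (np.random.rand(4, 4, 3) * 255).astype(np.uint8)
print(enhance(frame).shape)                  # (4, 4, 3)
```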
The user interface unit 13 may recognize an object area indicating the user command from the input user image, interpret the command indicated by the object area, and transfer a signal corresponding thereto to the microcontroller unit (MCU) 14 and the display unit 15.
The user interface unit 13 may include an interrupt determining part 131, an object setting part 132, and an object recognizing part 133.
The interrupt determining part 131 may determine the type of user interrupt, namely, whether a new object is to be set from the input image or a pre-set object is to be recognized from the input image.
When the user interrupt is determined to be an interrupt for setting a new object, the object setting part 132 may set a particular area (e.g., a hand, a mobile phone, or the like) of the input image as an object and store corresponding content.
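The setting step might, for example, crop the designated area and store a few simple features for later recognition. The feature set and storage format below are assumptions made for illustration, not the disclosed implementation; the final print line feeds the new setting back to the user, as described above for the object setting part:

```python
import numpy as np

# Sketch of the object setting step: crop the designated area of the input
# image and store simple features for later recognition. The feature set
# and storage format are assumptions made for illustration.

stored_objects = {}

def set_object(name, frame_rgb, region):
    """region: (x0, y0, x1, y1) bounding the area the user designates."""
    x0, y0, x1, y1 = region
    patch = frame_rgb[y0:y1, x0:x1]
    stored_objects[name] = {
        "size": patch.shape[:2],                     # rough scale prior
        "mean_color": patch.reshape(-1, 3).mean(0),  # color cue
        "hist": np.histogram(patch, bins=16, range=(0, 255))[0],  # texture proxy
    }
    # Feed the setting of the new object back to the user.
    print(f"object '{name}' set from region {region}")

frame = (np.random.rand(240, 320, 3) * 255).astype(np.uint8)
set_object("hand", frame, (100, 60, 180, 160))
```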
The object recognizing part 133, provided to recognize a pre-set object, may recognize an object from the input image based on information of the object previously set and stored, determine a command desired to be performed by the object, and output a user interface command signal.
First, the type of user interrupt is determined (S31).
When the user interrupt is determined to be an interrupt for inputting a new object, a user image is captured (S321) and an object is set from the captured image (S322).
When the user interface operation is not terminated (S34), the process returns to operation S31 of determining the type of user interrupt.
Meanwhile, when the user interrupt is determined to be an interrupt for recognizing an object, the object is recognized from the input image by using a pre-set object feature (S331), and a user interface operation may be performed according to a movement of the recognized object (S332).
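Putting operations S31 through S34 together, the overall method is a simple dispatch loop. The sketch below is a hedged outline of that flow; the helper functions passed in are hypothetical placeholders, not part of the disclosure:

```python
# Hedged sketch of the flow S31-S34 described above; the injected helper
# functions are hypothetical placeholders, not from the disclosure.

def ui_loop(frames, determine_interrupt, set_object, recognize, act):
    for frame in frames:                        # each frame: back to S31
        interrupt = determine_interrupt(frame)  # S31: determine interrupt type
        if interrupt == "INPUT_OBJECT":
            set_object(frame)                   # S321/S322: capture and set
        elif interrupt == "RECOGNIZE_OBJECT":
            obj = recognize(frame)              # S331: recognize stored object
            if obj is not None:
                act(obj)                        # S332: perform UI operation
        # S34: when not terminated, continue with the next frame (S31)

# Demo wiring with trivial stand-ins:
ui_loop(
    frames=["f1", "f2"],
    determine_interrupt=lambda f: "INPUT_OBJECT" if f == "f1" else "RECOGNIZE_OBJECT",
    set_object=lambda f: print("new object set from", f),
    recognize=lambda f: "hand",
    act=lambda o: print("UI command from", o),
)
```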
In order to recognize the object even when its shape is partially changed, the color, size, and texture of the object, as well as its shape and outline, may be used for recognizing the object.
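One way to combine these cues is a weighted distance over the stored features, so that no single cue (such as a partially changed outline) dominates the decision. The weights and distance measures below are illustrative assumptions; the feature dictionaries follow the format of the setting sketch above:

```python
import numpy as np

# Sketch of matching a candidate region against the stored object using
# several cues, so that a partial shape change (e.g., fingers curling)
# does not defeat recognition. Weights and distances are assumptions.

def match_score(candidate, stored, weights=(0.4, 0.2, 0.4)):
    """Lower is better. candidate/stored: dicts with 'mean_color',
    'size', and 'hist' keys as in the setting sketch above."""
    w_color, w_size, w_tex = weights
    d_color = np.linalg.norm(candidate["mean_color"] - stored["mean_color"]) / 255.0
    d_size = abs(np.prod(candidate["size"]) - np.prod(stored["size"])) / max(
        np.prod(stored["size"]), 1)
    ch = candidate["hist"] / max(candidate["hist"].sum(), 1)
    sh = stored["hist"] / max(stored["hist"].sum(), 1)
    d_tex = 0.5 * np.abs(ch - sh).sum()          # total-variation distance
    return w_color * d_color + w_size * d_size + w_tex * d_tex

a = {"mean_color": np.array([200., 150., 120.]), "size": (100, 80),
     "hist": np.ones(16)}
b = {"mean_color": np.array([205., 148., 118.]), "size": (96, 84),
     "hist": np.ones(16)}
print(match_score(a, b) < 0.1)   # similar features -> small distance: True
```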
As set forth above, according to embodiments of the invention, a movement of a user can be recognized by using the 2D image sensor, whereby the system can be stably controlled.
In addition, since the reaction of the system according to a movement of the user can be recognized in real time, a smooth interface can be obtained between the user and the system.
While the present invention has been shown and described in connection with the embodiments, it will be apparent to those skilled in the art that modifications and variations can be made without departing from the spirit and scope of the invention as defined by the appended claims.