The present invention claims priority of Korean Patent Application No. 10-2010-0034644, filed on Apr. 15, 2010, and No. 10-2010-0132490, filed on Dec. 22, 2010, which are incorporated herein by reference.
The present invention relates to a user interface, and more particularly, to a user interface device and a method for recognizing a user interaction using the same.
In line with recent technological developments, small-sized projectors and cameras are being mounted on mobile devices, and their range of applications is steadily expanding.
In addition, prototype projection systems that provide various services are being developed in which a user wears a small-sized projector and camera around the neck or on a shoulder, and wearable projection systems that can be carried portably are being developed as well.
Meanwhile, for efficient user interaction in a mobile environment, it is necessary to project an image onto a palm or a flat table and to interact with the projected image using a finger or a tool. In order to perform these processes in a mobile embedded system, a recognition technique with low computational cost is indispensable.
To improve the performance of interaction recognition through recognition of a user's posture, a tool such as a color marker has conventionally been worn on a hand or finger of the user. However, this is inconvenient for the user, who must carry the color marker.
To overcome this inconvenience, technologies for interaction with bare hands are also being developed. However, these technologies recognize a user interaction by processing a color image captured with a camera. In this case, a high degree of computation is needed to identify the user interaction in the color images, so recognizing the user interaction takes a long time and an embedded system fails to provide a fast response time. In particular, recognizing a touch operation performed with a bare finger or tool on an image projected onto the surface of a palm or table is a very difficult task that requires a large amount of computation.
Moreover, an image projected by the projector should be well matched in brightness, color, focus, and the like, and should be kept free of distortion so that high-quality user interaction recognition can be achieved. To this end, it is necessary to adjust the brightness, color, and focus and to precisely match the projected image to a particular space, such as a screen.
In view of the above, the present invention provides a user interface device and a method for recognizing a user interaction using the same, which allow the user interaction to be processed quickly with low computation.
Further, the present invention provides a user interface device and a method for recognizing a user interaction using the same, which allow fast adjustment of the brightness, color, and focus of an image projected for user interaction.
In accordance with an aspect of the present invention, there is provided a user interface device, which includes: a frame replacement unit configured to replace a frame of an input image signal by a pattern frame at a frame time; a projector module configured to project an image of the image signal with the pattern frame onto a target; an image acquisition unit configured to capture a pattern image of the pattern frame from the image projected onto the target; and an image recognition unit configured to recognize a user interaction using the captured pattern image.
In accordance with a second aspect of the present invention, there is provided a user interface device, which includes: a projector module configured to project an image onto a target; a pattern image generator configured to generate a laser beam having a pattern through diffraction to project a pattern image thereof onto the target; an image acquisition unit, in synchronization with the projection of the pattern image, configured to capture the pattern image projected onto the target on which the image is projected; and an image recognition unit configured to recognize a user interaction by using the captured pattern image.
The above and other objects and features of the present invention will become apparent from the following description of embodiments, given in conjunction with the accompanying drawings, in which:
The advantages and features of the present invention will be clearly understood from the following embodiments taken in conjunction with the accompanying drawings. In the drawings, like or similar reference numerals denote like or similar elements throughout the specification.
As shown, the user interface device includes a frame replacement unit 100, an image correction unit 110, a projector module 200, a light source controller 300, an optical controller 400, an image acquisition unit 500, a synchronization unit 600, an image recognition unit 700 and a target 900.
The frame replacement unit 100 replaces a frame of an input image signal by a pattern frame at a frame time. The image signal with the pattern frame is provided to the projector module 200. The projector module 200, which may be implemented with a projector, projects an image of the image signal with the pattern frame onto the target 900.
In the embodiment of the present invention, the target may include, but is not limited to, a flat surface such as a palm, paper, book, or screen.
The user may touch the image projected onto the target 900 with a finger or tool to generate a user interaction for controlling a machine.
The image acquisition unit 500, which may be implemented with a still camera or an IR (Infrared ray) camera, captures a pattern image of the pattern frame from the image projected onto the target 900 at the frame time. The pattern image captured by the image acquisition unit 500 is provided to the image recognition unit 700.
Meanwhile, the synchronization unit 600 performs frame synchronization of the projector module 200 and the image acquisition unit 500 so that the image acquisition unit 500 can acquire the pattern image from the image projected onto the target 900 in synchronization with the frame time. The frame time may be a fixed time interval or a random time interval.
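The cooperation between the frame replacement unit 100 and the frame-time schedule can be sketched as follows. This is a minimal illustrative sketch in which frames are plain in-memory objects; the function names (`frame_schedule`, `replace_frames`) and the `jitter` parameter for the random-interval option are our own labels, not part of the invention.

```python
import random

def frame_schedule(num_frames, interval, jitter=0):
    """Yield (index, is_pattern_frame) for each frame of the input signal.
    A pattern frame is scheduled every `interval` frames (fixed time
    interval); `jitter` > 0 randomizes the spacing, modelling the random
    frame-time option described above."""
    next_pattern = interval
    for i in range(num_frames):
        if i == next_pattern:
            yield i, True
            step = interval + (random.randint(-jitter, jitter) if jitter else 0)
            next_pattern = i + max(1, step)
        else:
            yield i, False

def replace_frames(frames, pattern_frame, interval):
    """Substitute the pattern frame at the scheduled frame times
    (the behaviour attributed to the frame replacement unit)."""
    return [pattern_frame if is_pat else f
            for (_, is_pat), f in zip(frame_schedule(len(frames), interval), frames)]
```

A camera synchronized to the same schedule would capture only at the indices where `is_pattern_frame` is true, which is what keeps the per-frame computation low.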
The image recognition unit 700 recognizes the user interaction, such as a motion event of the user's finger or of a tool, from the pattern image captured by the image acquisition unit 500. Further, the image recognition unit 700 detects the brightness, color, and distortion of the pattern image. The detected brightness, color, and distortion of the pattern image are provided to the image correction unit 110. In addition, the image recognition unit 700 detects a defocus of the pattern frame. The detected defocus is provided to the optical controller 400.
The optical controller 400 controls the focus, pan, and/or tilt of the projector module 200 depending on the detected defocus to correct the focus of the image to be projected onto the target 900. Upon perceiving the detected brightness, color and distortion, the image correction unit 110 corrects the brightness, color and distortion of the image to be projected onto the target 900. The light source controller 300 controls ON/OFF switching of a light source of the projector module 200.
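The document does not specify how the image recognition unit measures defocus. One common focus measure that could play this role is the variance of the Laplacian of the captured pattern image: a sharp grid pattern yields a high value, a defocused one a low value. The sketch below assumes grey-level images represented as 2-D lists; `sharpness` and `needs_refocus` are hypothetical names introduced here for illustration.

```python
def sharpness(img):
    """Estimate focus quality as the variance of a 4-neighbour Laplacian
    over the interior pixels of a grey-level image (2-D list of ints).
    Illustrative focus measure only -- the invention does not name one."""
    h, w = len(img), len(img[0])
    lap = [img[y-1][x] + img[y+1][x] + img[y][x-1] + img[y][x+1] - 4 * img[y][x]
           for y in range(1, h - 1) for x in range(1, w - 1)]
    mean = sum(lap) / len(lap)
    return sum((v - mean) ** 2 for v in lap) / len(lap)

def needs_refocus(img, threshold):
    """Report a defocus (to be passed to the optical controller) when the
    focus measure falls below a threshold."""
    return sharpness(img) < threshold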
Alternatively, the pattern image may take other forms, as shown in the accompanying drawings.
As stated above, the pattern image may be structured light, and may be a general-purpose image in unicolor, color, or IR. Unicolor and color images have the advantage that they can be used to correct the brightness, color, and distortion of the pattern image, and they can be utilized with a variety of RGB projectors. In addition, unicolor and color images make it easy to select a color that is readily distinguished from the background.
Meanwhile, in order to prevent quality degradation of the image projected onto the target, the pattern image should be invisible to the user's eye, and thus there may be a limit on the number of pattern images that can be substituted. However, if it is desired to increase both the quality of the image projected onto the target and the recognition of the user interaction even though a number of pattern frames are substituted for image frames, a high-speed frame replacement technique for the pattern frames and a high-speed image capturing technique may be employed.
In the case of using an IR image, image processing becomes easier because only the IR image is obtained by an IR camera when capturing the pattern image from the image projected onto the target, and there is hardly any limit on the number of IR images that can be substituted since they are invisible to the human eye. However, the IR image is not available for color and brightness correction, and it can only be used with an IR projector having an IR light source.
According to the present invention, even if a pattern image is captured only at the frame time, it is possible to recognize a user interaction, and therefore, the amount of computation for recognition of the user interaction can be reduced.
As such, the image recognition unit 700 perceives the change in the grid pattern of the pattern images, acquires information on coordinates of the changed grid pattern corresponding to the position of the finger, and thus identifies the user interaction. Based on the identification, the image recognition unit 700 is able to recognize a touch, a drag, a release and the like of the finger. The user interaction so recognized can be used as a user input to control a machine such as a computing system.
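The grid-pattern comparison described above can be sketched as follows. This is an illustrative sketch only: the document states that changes in the grid pattern are detected and mapped to coordinates, but the cell-wise absolute-difference criterion, the function name `changed_cells`, and the parameters are assumptions made here.

```python
def changed_cells(reference, captured, cell, threshold):
    """Locate cells of the projected grid pattern whose appearance changed
    between the reference pattern image and the captured one.  A finger or
    tool covering part of the grid distorts those cells; their coordinates
    give the interaction position.  `reference` and `captured` are 2-D
    lists of grey levels, `cell` is the grid cell size in pixels."""
    h, w = len(reference), len(reference[0])
    hits = []
    for cy in range(0, h, cell):
        for cx in range(0, w, cell):
            # Sum of absolute pixel differences within this grid cell.
            diff = sum(abs(reference[y][x] - captured[y][x])
                       for y in range(cy, min(cy + cell, h))
                       for x in range(cx, min(cx + cell, w)))
            if diff > threshold:
                hits.append((cx // cell, cy // cell))
    return hits
```

Tracking how the set of changed cells evolves over successive pattern frames would then distinguish a touch from a drag or a release.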
Conventionally, recognition rates varied with skin color or the surrounding environment when an event of a hand or finger motion was recognized from an image acquired by a camera. In the present invention, however, only the pattern image is captured from the image projected onto the target, and the event of the hand or finger motion is identified by detecting a change in the grid pattern of the captured pattern image, which is less affected by skin color or the surrounding environment. Thus, deviations in recognition rates are reduced and stable recognition results are achieved.
A unicolor pattern image or a color pattern image can be used in the mobile projection systems illustrated in the accompanying drawings.
Hereinafter, a method for recognizing a user interaction in accordance with an embodiment of the present invention will be described.
First, in step S801, the user interface device is initialized, and frame synchronization of the projector module 200 and the image acquisition unit 500 is performed by the synchronization unit 600.
In step S803, the image acquisition unit 500 checks whether it is a frame time. As a result of checking, if it is the frame time, the method proceeds to step S811 in which the image acquisition unit 500 captures the pattern image from the image projected onto the target 900.
Next, it is determined whether the pattern image is a unicolor/color pattern image in step S813. If the pattern image is determined to be a unicolor/color pattern image, the method proceeds to step S831 through a tab ‘C’, and otherwise, the method goes to step S815.
Thereafter, in step S831, the pattern image undergoes image processing, and the brightness and color of the pattern image are detected by the image recognition unit 700. The detected brightness and color of the pattern image are then provided to the light source controller 300, and the method proceeds to step S835.
Meanwhile, if the pattern image is determined to be an IR pattern image in step S815, the method proceeds to step S821 through a tab ‘D’ where the IR pattern image is subjected to an image processing.
In step S835, the image recognition unit 700 recognizes a user interaction through the pattern image. For instance, the user interaction can be recognized by detecting the change in the grid pattern of the pattern image caused by an event, such as a user's finger motion.
Further, the focus and distortion of the pattern image are also detected from the pattern image in respective steps S837 and S839, and the detected focus and distortion are provided to the optical controller 400 and the image correction unit 110. Thereafter, the method returns to step S803 through a tab ‘E’.
If it is determined in step S803 that it is not the frame time, the method goes to step S841. In step S841, the image correction unit 110 corrects the brightness and color of the image to be projected onto the target 900.
Next, in step S842, under the control of the optical controller 400, the projector module 200 is controlled depending on the detected focus to correct the focus of the image to be projected onto the target 900. In addition, in step S843, the image correction unit 110 corrects the distortion of the image to be projected onto the target 900 depending on the detected distortion. After that, the method advances to step S845 through a tab ‘B’.
Subsequently, in step S845, it is determined whether a pattern image is required for recognizing the user interaction. If so, in step S847, a frame of the input image signal is replaced by a pattern frame at the frame time, and the image signal with the pattern frame is provided to the projector module 200. Then, the image of the image signal is projected onto the target 900 by the projector module 200 in synchronization with the frame replacement unit 100. The method returns to step S803 for repeatedly performing the above processes.
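The control flow of steps S801 through S847 above can be summarized as a loop. The sketch below only traces which steps are visited for a given sequence of frame times and pattern types; the actual camera capture, recognition, and correction operations are omitted, and driving the loop from a list of `(is_frame_time, pattern_kind)` events is a simplification introduced here.

```python
def interaction_loop(events):
    """Trace the flowchart of steps S801-S847 for a list of
    (is_frame_time, pattern_kind) events, returning the visited steps.
    Control-flow illustration only; hardware interactions are stubbed out."""
    trace = ["S801"]                          # initialise, synchronise projector and camera
    for is_frame_time, kind in events:
        trace.append("S803")                  # frame-time check
        if is_frame_time:
            trace.append("S811")              # capture pattern image
            trace.append("S813")              # unicolor/color pattern image?
            if kind in ("unicolor", "color"):
                trace.append("S831")          # detect brightness and color
            else:
                trace += ["S815", "S821"]     # IR pattern image processing
            trace += ["S835", "S837", "S839"] # recognise interaction, detect focus and distortion
        else:
            trace += ["S841", "S842", "S843"] # correct brightness/color, focus, distortion
            trace += ["S845", "S847"]         # pattern frame required? replace frame
    return trace
```

For example, one color-pattern frame time followed by a non-frame time visits the capture/recognition branch and then the correction branch before returning to the S803 check.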
As shown in the accompanying drawings, a user interface device in accordance with another embodiment of the present invention includes a projector module 1200, a pattern image generator 1120, an image acquisition unit 1500, an image recognition unit 1700, an image correction unit 1110, an optical controller 1400, and a target 1900.
The projector module 1200, which may be implemented with a projector, projects an image of an input image signal onto the target 1900. The pattern image generator 1120, which may include a laser projection module and a diffraction grating, generates a laser beam having a pattern passing through the diffraction grating to project a pattern image at a frame time onto the target 1900. The pattern image generator 1120 may generate a pattern image of various types depending on the diffraction grating as well as the pattern image of the grid pattern as in the first embodiment.
The image acquisition unit 1500, which may be implemented with an IR camera, captures the pattern image projected onto the target 1900 on which the subject image is projected, in synchronization with the frame time. The pattern image captured by the image acquisition unit 1500 is provided to the image recognition unit 1700.
The image recognition unit 1700 recognizes the user interaction from the pattern image captured by the image acquisition unit 1500, such as a motion event of the user's finger or of a tool. Further, the image recognition unit 1700 detects brightness, color, distortion and focus of the pattern image. The detected brightness, color, and distortion of the pattern image are provided to the image correction unit 1110 and the detected focus is provided to the optical controller 1400.
The image correction unit 1110 corrects the image to be projected onto the target 1900 depending on the detected brightness, color, and distortion, and the optical controller 1400 controls an optical system in the projector module 1200 to correct the focus of the image to be projected onto the target 1900.
As described above, in accordance with the embodiments of the present invention, the user can interact with a bare hand, and only low computation is needed for recognition of the interaction; processing an interaction therefore takes a short time, offering a fast response time.
In addition, the use of a pattern image makes it possible to achieve high recognition performance regardless of skin color and surrounding light.
Also, the brightness, color, and focus of the projector can be corrected quickly, and the projected image can be quickly and precisely matched to a particular space, such as a screen, without any distortion.
The present invention as described above is applicable to a mobile device equipped with a projector and a camera, as well as to a projector system such as a projection computer. In particular, the present invention is even more applicable to a mobile device or wearable system subject to severe changes in the surrounding environment, such as ambient light level, lighting, and wobbling, or to a small-sized embedded system requiring a low-computation technique.
While the invention has been shown and described with respect to the particular embodiments, it will be understood by those skilled in the art that various changes and modifications may be made without departing from the scope of the present invention as defined in the following claims.
Number | Date | Country | Kind |
---|---|---|---|
10-2010-0034644 | Apr 2010 | KR | national |
10-2010-0132490 | Dec 2010 | KR | national |