1. Field of the Disclosure
The disclosure relates to a mobile device and a method for controlling a graphical user interface (GUI) of the mobile device.
2. Description of the Related Art
In recent years, smart computing devices have gradually evolved from desktop computers to mobile devices. Some of the latest smart mobile devices are designed as wearable devices and are becoming increasingly diverse, such as augmented reality (AR) glasses, smart watches, control bracelets, and control rings.
The smart wearable devices interact with their users through cameras, touch sensors, voice sensors, or motion sensors. The purpose of these devices is to provide their users with more means to complete specific tasks. One of the factors that affect the convenience and efficiency of smart mobile devices is the user interface.
This disclosure provides a mobile device and a method for controlling a GUI of the mobile device. The mobile device is a wearable device that provides an easy, intuitive, and interactive GUI and is capable of conveying a real sense of touch to the user.
In an embodiment of the disclosure, a mobile device is provided, which comprises a camera unit, a sensor unit, a see-through display, and a processor. The camera unit takes an image of a finger and a surface. The sensor unit generates a sensor signal in response to a motion of the finger. The taking of the image and the generation of the sensor signal are synchronous. The see-through display displays a GUI on the surface. The processor is coupled to the camera unit, the sensor unit, and the see-through display. The processor uses both of the image and the sensor signal to detect a touch of the finger on the surface. The processor adjusts the GUI in response to the touch.
In an embodiment of the disclosure, a method for controlling a GUI of a mobile device is provided. The method comprises the following steps: taking an image of a finger and a surface; generating a sensor signal in response to a motion of the finger, wherein the taking of the image and the generation of the sensor signal are synchronous; displaying the GUI on the surface; using both of the image and the sensor signal to detect a touch of the finger on the surface; and adjusting the GUI of the mobile device in response to the touch.
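A minimal sketch of this control flow is given below. It is purely illustrative and not the claimed implementation: the function names, the brightest-pixel stand-in for fingertip tracking, and the magnitude threshold are assumptions, and the camera frame and sensor samples are assumed to be delivered as synchronized NumPy arrays covering the same time window.

```python
import numpy as np

def detect_touch(frame: np.ndarray, sensor_signal: np.ndarray):
    """Return the touched position in image coordinates, or None.

    The frame and the sensor samples are assumed to be synchronized,
    i.e. they cover the same time window."""
    fingertip = locate_fingertip(frame)
    if fingertip is None:
        return None                      # no finger in the image, ignore the sensor
    if not contact_detected(sensor_signal):
        return None                      # finger visible but no physical contact
    return fingertip                     # the GUI is adjusted at this position

def locate_fingertip(frame: np.ndarray):
    # Placeholder: the brightest pixel of an 8-bit grayscale frame stands
    # in for a tracked fingertip; a real implementation would segment and
    # track the hand.
    if frame.max() < 50:
        return None
    return np.unravel_index(int(np.argmax(frame)), frame.shape)

def contact_detected(signal: np.ndarray, threshold: float = 1.0) -> bool:
    # Placeholder: contact is assumed when the signal magnitude exceeds a threshold.
    return float(np.max(np.abs(signal))) > threshold
```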
The accompanying drawings are included to provide a further understanding of the disclosure, and are incorporated in and constitute a part of this specification. The drawings illustrate embodiments of the disclosure and, together with the description, serve to explain the principles of the disclosure.
Below, embodiments will be described in detail with reference to accompanying drawings so as to be easily realized by a person having ordinary knowledge in the art. The inventive concept may be embodied in various forms without being limited to the embodiments set forth herein. Descriptions of well-known parts are omitted for clarity, and like reference numerals refer to like elements throughout.
The mobile device 100 may be designed as a head-mounted device, such as a pair of glasses. The display 130 may be a see-through display disposed as the lenses of the glasses. The camera unit 110 may take images of the environment observed by the user of the mobile device 100. When the processor 140 identifies a surface in the image, such as a desktop, a wall, or a palm of the user, the display 130 may project a virtual GUI of the mobile device 100 into the eyes of the user by refraction. The virtual GUI is not projected on the surface. The user still can see the physical surface through the GUI and the display 130. Consequently, what the user sees is the physical surface overlaid with the virtual GUI.
The user may touch the surface with a fingertip to interact with the mobile device 100. The user feels as if he or she is touching the GUI displayed by the display 130, and the physical surface provides a real sense of touch to the user. The processor 140 detects the position of the touch by identifying and tracking the fingertip in the image taken by the camera unit 110. The sensor unit 120 may detect physical phenomena such as sound, pressure, or vibration caused by the fingertip of the user touching the surface, so that the processor 140 can check whether the touch occurs or not by analyzing the sensor signal generated by the sensor unit 120.
The hand used by the user to touch the surface is defined as the index hand, while the hand whose palm serves as the surface is defined as the reference hand.
The sensor unit 120 may comprise one or more sensors. Each sensor may be disposed at one of three positions. The first position is the main body of the mobile device 100, such as the aforementioned glasses. The second position is on the index hand, such as the position 225 in
The sensor unit 120 may comprise an electromyography (EMG) sensor, a vibration sensor, a pressure sensor, a microphone, a gyroscope sensor, an accelerometer, or any combination of the aforementioned sensors. The EMG sensor generates electromyographic signals as the sensor signal in response to contractions of muscles of the user. The vibration sensor generates the sensor signal in response to vibrations caused by the touch. The pressure sensor generates the sensor signal in response to pressures caused by the touch. The pressure sensor may need to be disposed on the palm or the fingertip to detect the pressures. The microphone generates the sensor signal in response to sounds caused by the touch. The gyroscope sensor and the accelerometer may generate the sensor signal in response to motions of the finger, the index hand, or the reference hand.
The mobile device 100 may use various techniques to eliminate interference from noise in either the image or the sensor signal in order to prevent false touch events. For example, the camera unit 110 may comprise an infrared (IR) light source and an IR camera. The IR light source may provide IR lighting on the finger and the IR camera may take IR images of the finger. Since the finger is much closer to the IR camera than the background, the finger appears much more clearly in the IR images than the background does, making it easy to filter out background noise in the IR images.
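One simple way to realize such filtering is sketched below, assuming the IR frame arrives as an 8-bit grayscale NumPy array; the threshold value is an assumption chosen for illustration. Pixels dimmer than the threshold are treated as distant background, since the nearby finger reflects far more IR light.

```python
import numpy as np

def suppress_ir_background(ir_frame: np.ndarray, threshold: int = 120) -> np.ndarray:
    """Zero out background pixels in an 8-bit IR image.

    Because the finger is much closer to the IR light source and camera
    than the background, its pixels are markedly brighter; everything
    below the threshold is treated as background noise."""
    mask = ir_frame >= threshold
    return np.where(mask, ir_frame, 0).astype(ir_frame.dtype)
```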
The image taken by the camera unit 110 is prone to uncertainty about whether the touch has occurred, because the index hand and the reference hand in the image are usually arranged like the two hands in
The camera unit 110 may comprise one or more time-of-flight (TOF) cameras, one or more dual color cameras, one or more structured-light ranging cameras, one or more IR cameras and one or more IR light sources, or any combination of the aforementioned cameras and light sources. When the camera unit 110 comprises two cameras, the two cameras may take stereo images of the finger and the surface. The processor 140 may use the stereo images to estimate the distance between the finger and the surface. The processor 140 may use the distance as an auxiliary cue to detect the touch of the finger of the user on the surface.
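As a sketch of how such a distance estimate can be obtained, the standard rectified-stereo relation Z = f·b/d may be applied to the fingertip and to the surface point beneath it, assuming a calibrated pair of parallel cameras with known focal length (in pixels) and baseline (in meters); the numeric values below are illustrative only.

```python
def depth_from_disparity(x_left: float, x_right: float,
                         focal_px: float, baseline_m: float) -> float:
    """Estimate distance (in meters) of a point seen in a rectified stereo
    pair from its horizontal disparity: Z = f * b / (x_left - x_right)."""
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("point must have positive disparity in a rectified pair")
    return focal_px * baseline_m / disparity

# Example: the fingertip and the surface point below it, seen by both cameras.
finger_depth = depth_from_disparity(412.0, 396.0, focal_px=700.0, baseline_m=0.06)
surface_depth = depth_from_disparity(410.0, 395.0, focal_px=700.0, baseline_m=0.06)
gap = surface_depth - finger_depth   # a small gap suggests the finger is near the surface
```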
As mentioned previously, the sensor unit 120 may comprise a gyroscope sensor and/or an accelerometer configured to detect motions of the finger. The processor 140 may detect the touch according to the image and one or more changes of the motion of the finger. When the fingertip touches the surface, the motion of the finger may stop suddenly or slow down suddenly, or the finger may vibrate. The processor 140 may detect the touch by analyzing the sensor signal generated by the gyroscope sensor and/or the accelerometer to detect those changes of the motion of the finger.
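A minimal sketch of detecting such a sudden change from accelerometer samples is given below; the jerk threshold and sample layout are assumptions, not values taken from the disclosure.

```python
import numpy as np

def sudden_deceleration(accel: np.ndarray, sample_rate_hz: float,
                        jerk_threshold: float = 50.0) -> bool:
    """Return True when the acceleration magnitude changes abruptly,
    as it may when a moving fingertip is stopped by a surface.

    `accel` is an (N, 3) array of accelerometer samples in m/s^2; the
    per-sample change of the magnitude (a crude jerk estimate) is
    compared against a threshold in m/s^3."""
    magnitude = np.linalg.norm(accel, axis=1)
    jerk = np.diff(magnitude) * sample_rate_hz
    return bool(np.max(np.abs(jerk)) > jerk_threshold)
```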
In step 325, the processor 140 checks whether the finger of the user can be identified in the image or not. When the processor 140 can identify the finger in the image, the processor 140 uses both of the image and the sensor signal to detect a touch of the finger on the surface in step 330. In step 335, the processor 140 checks whether the occurrence of the touch is confirmed or not. When the touch is confirmed, the processor 140 adjusts the GUI in response to the touch in step 340 to provide visual feedback of the touch to the user.
When the processor 140 fails to identify the finger in the image in step 325, the processor 140 ignores the sensor signal, does not attempt to detect the touch, and the flow proceeds to step 345. In step 345, the processor 140 uses the image to check whether a preset gesture of the palm of the reference hand can be identified in the image or not. When the processor 140 identifies the preset gesture of the palm in the image, the processor 140 may perform a preset function in response to the preset gesture in step 350.
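The branching logic of steps 325 through 350 may be summarized by the sketch below. The detector callables and the GUI methods are hypothetical placeholders injected by the caller; only the order of the checks reflects the flow described above.

```python
def process_frame(frame, sensor_signal, gui,
                  finger_identified, touch_detected, palm_gesture_identified):
    """Illustrative branching for steps 325-350; detectors are injected callables."""
    if finger_identified(frame):                        # step 325
        if touch_detected(frame, sensor_signal):        # steps 330 and 335
            gui.adjust_for_touch()                      # step 340: visual feedback
    elif palm_gesture_identified(frame):                # step 345: sensor signal ignored
        gui.perform_preset_function()                   # step 350
```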
The processor 140 may identify and track the palm in the image according to characteristics of the palm such as color, features, and shape. The processor 140 may identify the preset gesture of the palm according to at least one of a change of shape of the palm in the image, a change of position of the palm in the image, and a change of motion of the palm in the image. The position of the palm may comprise the distance from the palm to the camera unit 110 or the surface when the camera unit 110 can take stereo images.
The processor 140 may use both of the image and the sensor signal to identify the preset gesture of the palm in the image. For example, when the sensor unit 120 comprises an EMG sensor, the sensor signal generated by the EMG sensor may indicate contractions of finger muscles resulting from a change of gesture of the palm. When the sensor unit 120 comprises a vibration sensor, the sensor signal generated by the vibration sensor may indicate vibrations resulting from a change of gesture of the palm. In addition to identifying the gesture of the palm in the image, the processor 140 may analyze the magnitude and/or the waveform of the sensor signal to identify the gesture of the palm.
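One possible way to fuse the two sources, sketched below under the assumption that the image analysis yields a normalized measure of how much the palm contour changed and that the EMG (or vibration) signal is available as a NumPy array, is to confirm the gesture only when both the image-based measure and the signal magnitude exceed thresholds; all names and thresholds here are illustrative.

```python
import numpy as np

def gesture_confirmed(palm_shape_change: float, emg_signal: np.ndarray,
                      shape_threshold: float = 0.2, emg_threshold: float = 0.5) -> bool:
    """Confirm a preset palm gesture only when both sources agree.

    `palm_shape_change` is a normalized measure of contour change between
    frames (from the image); the EMG signal must also show muscle activity
    whose RMS magnitude exceeds a threshold."""
    emg_rms = float(np.sqrt(np.mean(np.square(emg_signal))))
    return palm_shape_change > shape_threshold and emg_rms > emg_threshold
```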
In step 420, the processor 140 checks whether the fingertip is identified in the image and whether the fingertip is in a touch-operable area of the GUI. When the processor 140 fails to identify the fingertip, or when the fingertip is in a non-touch-operable area of the GUI, the processor 140 ignores the sensor signal and does not attempt to detect the touch. In this case, the processor 140 determines that there is no touch event in step 460. When the processor 140 identifies the fingertip and the fingertip is in a touch-operable area of the GUI, the processor 140 analyzes the sensor signal to detect the touch in step 430.
The processor 140 may detect the touch according to the magnitude of the sensor signal. In this case, the processor 140 checks whether the magnitude of the sensor signal is larger than a preset threshold or not in step 440. When the magnitude of the sensor signal is larger than the preset threshold, the processor 140 confirms the touch in step 450. Otherwise, the processor 140 determines that there is no touch event in step 460.
Temperature and humidity affect the elasticity of human skin, and the elasticity of human skin affects the vibration detected by the vibration sensor in the sensor unit 120. In an embodiment, the sensor unit 120 may comprise a temperature sensor configured to measure the ambient temperature of the finger, and the sensor unit 120 may further comprise a humidity sensor configured to measure the ambient humidity of the finger. The processor 140 may adjust the preset threshold according to the ambient temperature and the ambient humidity to improve the accuracy of the touch detection.
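A minimal sketch of such a magnitude check with a condition-adjusted threshold follows. The linear correction, its coefficients, and the reference conditions are assumptions chosen only to illustrate the idea of lowering the threshold when warmer, more humid skin damps vibrations.

```python
import numpy as np

def adjusted_threshold(base_threshold: float, temperature_c: float,
                       humidity_pct: float) -> float:
    """Illustrative correction: lower the threshold slightly as temperature
    and humidity rise, relative to 25 degC and 50 % relative humidity."""
    correction = 1.0 - 0.005 * (temperature_c - 25.0) - 0.002 * (humidity_pct - 50.0)
    return base_threshold * max(correction, 0.5)

def touch_by_magnitude(signal: np.ndarray, base_threshold: float,
                       temperature_c: float, humidity_pct: float) -> bool:
    """Steps 440/450: confirm the touch when the signal magnitude exceeds
    the (condition-adjusted) threshold; otherwise report no touch event."""
    threshold = adjusted_threshold(base_threshold, temperature_c, humidity_pct)
    return float(np.max(np.abs(signal))) > threshold
```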
Alternatively, the processor 140 may detect the touch according to the waveform of the sensor signal. In this case, the processor 140 checks whether the waveform of the sensor signal matches a preset waveform or not in step 440. When the waveform of the sensor signal matches the preset waveform, the processor 140 confirms the touch in step 450. Otherwise, the processor 140 determines that there is no touch event in step 460.
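The waveform comparison could, for example, be realized with a normalized sliding correlation against a recorded touch template, as sketched below; the global z-normalization, the correlation threshold, and the assumption that the sensor signal is at least as long as the template are all illustrative choices rather than details given in the disclosure.

```python
import numpy as np

def waveform_matches(signal: np.ndarray, template: np.ndarray,
                     min_correlation: float = 0.8) -> bool:
    """Step 440 (waveform variant): z-normalize both sequences, slide the
    template over the signal, and confirm the touch when the best
    normalized correlation value reaches `min_correlation`."""
    signal = (signal - signal.mean()) / (signal.std() + 1e-9)
    template = (template - template.mean()) / (template.std() + 1e-9)
    correlation = np.correlate(signal, template, mode="valid") / len(template)
    return float(correlation.max()) >= min_correlation
```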
A multiple click of the finger, such as a double click or a triple click, on the surface is difficult to detect based on the image alone. On the other hand, the detection of a multiple click is easy based on both of the image and the sensor signal. The processor 140 may detect a multiple click on the surface by analyzing the image to confirm the position of the multiple click and analyzing the sensor signal to confirm the number of clicks of the finger on the surface, and then the processor 140 may perform a preset function in response to the multiple click. The processor 140 may analyze the sensor signal to detect the multiple click only when the fingertip is in a touch-operable area of the GUI, in order to filter out unwanted noise in the sensor signal.
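Counting the clicks in the sensor signal could be as simple as counting well-separated threshold crossings, as in the sketch below; the threshold and the minimum gap between clicks are assumed values for illustration.

```python
import numpy as np

def count_clicks(signal: np.ndarray, threshold: float,
                 min_gap_samples: int = 20) -> int:
    """Count distinct clicks in the sensor signal.

    A click is a sample whose magnitude exceeds `threshold`; crossings
    closer together than `min_gap_samples` are merged into one click so
    that the ringing of a single impact is not counted twice."""
    above = np.flatnonzero(np.abs(signal) > threshold)
    if above.size == 0:
        return 0
    # Start a new click whenever the gap to the previous crossing is large enough.
    return int(1 + np.count_nonzero(np.diff(above) > min_gap_samples))

# Example: two bursts separated by silence are counted as a double click.
example = np.zeros(200)
example[50:55] = 2.0
example[150:155] = 2.0
assert count_clicks(example, threshold=1.0) == 2
```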
The user may trigger a preset function by performing a preset motion on the surface, such as a straight slide on the surface or drawing the letter "s" on the surface. For example,
Such a preset motion on the surface is difficult to detect based on the image alone because it is difficult to confirm the continued contact of the finger on the surface. On the other hand, the detection of the preset motion on the surface is easy based on both of the image and the sensor signal because the sensor unit 120 can easily detect the continued contact. The processor 140 may detect a preset motion of the finger on the surface by tracking the motion of the finger in the image and analyzing the sensor signal to confirm the continued contact of the finger on the surface, and then the processor 140 may perform a preset function in response to the preset motion.
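As an illustration of this combination, the sketch below checks a straight-slide motion: the fingertip trajectory comes from the image, and a per-frame contact flag derived from the sensor signal confirms continued contact. The minimum slide length and the allowed deviation from a straight line are assumed values.

```python
import numpy as np

def straight_slide_detected(fingertip_track: np.ndarray, contact_flags: np.ndarray,
                            min_length_px: float = 80.0,
                            max_deviation_px: float = 10.0) -> bool:
    """Detect a straight slide performed while the finger stays on the surface.

    `fingertip_track` is an (N, 2) array of fingertip positions from the
    image; `contact_flags` is a boolean array (one per frame) derived from
    the sensor signal indicating continued contact with the surface."""
    if not np.all(contact_flags):
        return False                               # contact was broken mid-motion
    start, end = fingertip_track[0], fingertip_track[-1]
    direction = end - start
    length = float(np.linalg.norm(direction))
    if length < min_length_px:
        return False                               # too short to count as a slide
    # Perpendicular deviation of every tracked point from the start-end line.
    unit = direction / length
    offsets = fingertip_track - start
    deviation = np.abs(offsets[:, 0] * unit[1] - offsets[:, 1] * unit[0])
    return float(deviation.max()) <= max_deviation_px
```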
For example, the mobile device 100 may record photographs and profiles of friends of the user. The camera unit 110 may take an image of a person and the processor 140 may compare the image with the photographs of the friends to identify the person. The user may select a file and perform the preset motion. The preset function associated with the preset motion may be sending the file to a mobile device owned by the person or an account of the person on a remote server.
The following figures from
It will be apparent to those skilled in the art that various modifications and variations can be made to the disclosed embodiments. It is intended that the true scope of the disclosure is indicated by the following claims and their equivalents.
This application claims the priority benefits of U.S. provisional application Ser. No. 61/839,881, filed on Jun. 27, 2013. The entirety of the above-mentioned patent application is hereby incorporated by reference herein and made a part of this specification.