This application claims the priority benefit of Taiwan Patent Application Serial Number 101106481, filed on Feb. 29, 2012, the full disclosure of which is incorporated herein by reference.
1. Field of the Disclosure
This disclosure generally relates to a human-machine interface device and, more particularly, to an optical touch device and a detection method capable of detecting a hovering object and a contact object.
2. Description of the Related Art
A touch system can be controlled without using a separate peripheral device, such that the touch system provides excellent operational convenience and can be applied to various human-machine interface devices. An optical touch system generally employs an optical sensor configured to detect reflected light from a finger to accordingly identify a position or a gesture of the finger.
Accordingly, the present disclosure further provides an optical touch device and a detection method thereof that can detect both a hovering object and a contact object and can eliminate the interference from ambient light.
It is an object of the present disclosure to provide an optical touch device and detection method thereof configured to detect an operating state of an object.
It is another object of the present disclosure to provide an optical touch device and detection method thereof that can eliminate the interference from ambient light.
The present disclosure provides an optical touch device including a light source, a light control unit, a light guide, an image sensor and a processing unit. The light control unit controls the light source to illuminate in different brightness values. The light guide has an incident surface, a touch surface and an ejection surface, wherein the light source emits incident light into the light guide through the incident surface, and a plurality of microstructures are formed inside of and/or on the ejection surface of the light guide and configured to disperse the incident light toward the touch surface to become dispersed light. The image sensor receives reflected light ejecting from the ejection surface to generate image frames corresponding to the different brightness values of the light source. The processing unit is configured to calculate a differential image of the image frames and identify an operating state according to the differential image.
The present disclosure further provides a detection method of an optical touch device. The optical touch device includes a light source, a light guide, an image sensor and a processing unit, wherein the light guide has an incident surface, a touch surface and an ejection surface, and a plurality of microstructures are formed inside of and/or on the ejection surface of the light guide. The detection method includes the steps of: using the light source to illuminate in a first brightness value and a second brightness value; using the image sensor to capture, at a sampling frequency, reflected light formed by incident light emitted into the light guide by the light source and then dispersed toward the touch surface by the microstructures and then ejecting from the ejection surface so as to generate a first image frame corresponding to the first brightness value and a second image frame corresponding to the second brightness value; using the processing unit to calculate a differential image of the first image frame and the second image frame; and using the processing unit to identify an operating state according to a comparison result of comparing the differential image with two thresholds.
The present disclosure further provides a detection method of an optical touch device. The optical touch device includes a light source, a light guide, an image sensor and a processing unit, wherein the light guide has an incident surface, a touch surface and an ejection surface, and a plurality of microstructures are formed inside of and/or on the ejection surface of the light guide. The detection method includes the steps of: using the light source to illuminate in a first brightness value, a second brightness value and a third brightness value; using the image sensor to capture, at a sampling frequency, reflected light formed by incident light emitted into the light guide by the light source, dispersed toward the touch surface by the microstructures, reflected by at least one object in front of the touch surface and then ejecting from the ejection surface, so as to generate a first image frame corresponding to the first brightness value, a second image frame corresponding to the second brightness value and a third image frame corresponding to the third brightness value; using the processing unit to calculate a first differential image of the first image frame and the third image frame and a second differential image of the second image frame and the third image frame; and using the processing unit to identify an operating state according to comparison results of comparing the first differential image and the second differential image with at least one threshold.
In the optical touch device of the present disclosure, the microstructures are formed on an opposite surface of the touch surface (i.e. the ejection surface) and/or inside the light guide, rather than on the touch surface, and are configured to disperse the incident light toward the touch surface to become dispersed light, wherein the microstructures may have any shape and may be convexes, irregularities or concaves formed by a printing, spraying, etching, atomizing or pressing process without any limitation. In other words, the light guide is designed to be non-zero order so as to form a dispersing light field that decays rapidly with distance in front of the touch surface. When an object enters the dispersing light field, the object reflects light toward the light guide to allow the image sensor in front of the ejection surface to detect the reflected light.
In the optical touch device of the present disclosure, a hovering object or a contact object is identified according to the differential image calculated from the image frames captured by the image sensor, such that the interference from ambient light can be effectively eliminated, thereby increasing the identification accuracy.
In the optical touch device and the detection method of the present disclosure, the reflected light ejecting from the ejection surface is formed by an object in front of the touch surface, which is approaching or touching the touch surface, reflecting the dispersed light dispersed by the microstructures to pass through the light guide.
Other objects, advantages, and novel features of the present disclosure will become more apparent from the following detailed description when taken in conjunction with the accompanying drawings.
It should be noted that, wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts.
Referring to the accompanying drawings, the optical touch device 1 of the present embodiment includes a light source 11, a light control unit 12, a light guide 13, an image sensor 14, a processing unit 15 and a transmission interface 16, wherein the light control unit 12 may be included in the processing unit 15 or independent from the processing unit 15 without any limitation.
The light source 11 is preferably a light emitting diode configured to emit infrared light, red light or other invisible light. The light source 11 is configured to emit incident light 111 into the light guide 13 through an incident surface 131 of the light guide 13, and the incident light 111 propagates away from the incident surface 131 inside the light guide 13. In other words, the light source 11 is disposed opposite to the incident surface 131.
The light control unit 12 is configured to control the light source 11 to illuminate in different brightness values. The purpose of controlling the light source 11 to illuminate in different brightness values is to eliminate the interference from ambient light by calculating a differential image in the post-processing (described later). The light control unit 12 is controlled by the processing unit 15 to synchronize the lighting of the light source 11 with the image capturing of the image sensor 14.
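As a minimal sketch of this synchronization (assuming hypothetical driver functions set_brightness() and capture_frame(); the actual interfaces of the light control unit 12 and the image sensor 14 are hardware-specific), the alternating capture may be expressed as follows:

```python
# Minimal sketch of synchronizing the light source brightness with image capture.
# set_brightness() and capture_frame() are hypothetical driver functions standing
# in for the light control unit 12 and the image sensor 14.
def capture_frame_pair(set_brightness, capture_frame, first_brightness, second_brightness):
    """Capture one image frame per brightness value, in lock-step with the light source."""
    set_brightness(first_brightness)    # light source illuminates in the first brightness value
    f1 = capture_frame()                # image sensor captures the first image frame
    set_brightness(second_brightness)   # light source illuminates in the second brightness value
    f2 = capture_frame()                # image sensor captures the second image frame
    return f1, f2
```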
The light guide 13 may be made of materials transparent to the light emitted by the light source 11, e.g. glass or plastic, but not limited thereto. The light guide 13 has the incident surface 131, a touch surface 132 and an ejection surface 133, wherein the touch surface 132 and the ejection surface 133 are opposite to each other. An object 2 is operated in front of the touch surface 132, wherein operable functions are similar to those of a general optical touch device and thus details thereof are not described herein. In this embodiment, a plurality of microstructures 134 are formed on the ejection surface 133 (e.g. on its inner surface or exterior surface) of the light guide 13 and/or a plurality of microstructures 134′ are formed inside the light guide 13, and the microstructures 134, 134′ are configured to disperse the incident light 111 toward the touch surface 132 to become dispersed light 112 ejecting from the touch surface 132.
The image sensor 14 may be a CCD image sensor, a CMOS image sensor or another sensor configured to sense optical energy. The image sensor 14 is disposed at a side of the ejection surface 133 and is configured to receive and capture, at a sampling frequency, the reflected light 113 ejecting from the ejection surface 133 so as to generate image frames corresponding to the different brightness values of the light source 11 (described later). The image frames are transmitted to the processing unit 15 for post-processing.
The processing unit 15 calculates a differential image of the image frames and identifies the operating state according to the differential image.
The transmission interface 16 is configured to transmit the operating state, in a wired or wireless manner, to a related electronic device for corresponding control.
Referring to the accompanying drawings, a detection method of the optical touch device 1 according to an embodiment of the present disclosure includes the following steps.
Step S31: The light control unit 12 controls the light source 11 to illuminate alternately in a first brightness value (e.g. shown as rectangles having a longer length) and a second brightness value (e.g. shown as rectangles having a shorter length).
Step S32: The image sensor 14 captures, at a fixed sampling frequency, reflected light 113 formed by incident light 111 emitted into the light guide 13 by the light source 11 through the incident surface 131 and then dispersed toward the touch surface 132 by the microstructures 134, 134′ to eject from the touch surface 132 and then reflected by the object 2, 2′ to pass through the light guide 13 and to eject from the ejection surface 133 so as to generate a first image frame f1 corresponding to the first brightness value and a second image frame f2 corresponding to the second brightness value, and the image frames are sent to the processing unit 15 for post-processing, wherein as the first brightness value is larger than the second brightness value, an average intensity of the first image frame f1 is larger than that of the second image frame f2.
Step S33: The processing unit 15 then calculates a differential image (f1-f2) of the first image frame f1 and the second image frame f2. As each of the image frames f1 and f2 captured by the image sensor 14 contains ambient light, the interference from the ambient light can be effectively eliminated by calculating the differential image (f1-f2).
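As a simplified illustration of this cancellation, each pixel of the captured image frames may be modeled as the sum of light reflected from the light source and ambient light, assuming the ambient contribution A and the object reflection R are essentially unchanged between the two exposures:

\[
f_1 = R\,B_1 + A,\qquad f_2 = R\,B_2 + A,\qquad f_1 - f_2 = R\,(B_1 - B_2),
\]

where B_1 and B_2 denote the first and second brightness values. The ambient term A cancels in the differential image, which therefore depends only on light reflected by the object from the light source.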
Step S34: The processing unit 15 identifies whether the object is in a hovering state or a contact state according to a comparison result of comparing the differential image (f1-f2) with a first threshold (e.g. a hovering threshold Th1) and a second threshold (e.g. a contact threshold Th2).
For example, in one embodiment, when the pixel intensity of a part of the pixel area or a maximum pixel intensity of a differential image (f1-f2) is larger than the first threshold Th1 and smaller than the second threshold Th2, it means that the object 2 can already be illuminated by the dispersed light 112 but is not in contact with the touch surface 132, and thus the object 2 is identified in a hovering state. When the pixel intensity of a part of the image area or a maximum pixel intensity of a differential image (f1-f2)′ is larger than the second threshold Th2, it means that the object 2′ is in contact with the touch surface 132 and reflects a large amount of the dispersed light 112, and thus the object 2′ is identified in a contact state. When the pixel intensity of all pixels or a maximum pixel intensity of a differential image (f1-f2)″ is smaller than the first threshold Th1, it means that the object is neither in a hovering state nor in a contact state. In this embodiment, a part of the differential image (f1-f2) is compared with two thresholds.
In another embodiment, when an average pixel intensity of a differential image (f1-f2) is larger than the first threshold Th1 and is smaller than the second threshold Th2, the object 2 is identified in a hovering state. When an average pixel intensity of a differential image (f1-f2)′ is larger than the second threshold Th2, the object 2′ is identified in a contact state. In this embodiment, the whole (i.e. average intensity) of the differential image (f1-f2) is compared with two thresholds.
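A minimal sketch of this identification, assuming the image frames are available as numpy arrays and that the threshold values Th1 and Th2 have been determined beforehand (both the array representation and the helper name are assumptions for illustration), is:

```python
import numpy as np

def identify_operating_state(f1, f2, th1, th2, use_average=False):
    """Identify the operating state from two image frames captured with the
    first (larger) and second (smaller) brightness values.

    th1 -- hovering threshold Th1; th2 -- contact threshold Th2 (th2 > th1).
    Set use_average=True to compare the average pixel intensity instead of
    the maximum pixel intensity.
    """
    diff = f1.astype(np.int32) - f2.astype(np.int32)      # differential image (f1 - f2)
    value = diff.mean() if use_average else diff.max()    # characteristic value of the differential image
    if value > th2:
        return "contact"    # object is in contact with the touch surface
    if value > th1:
        return "hovering"   # object is illuminated by the dispersed light but not in contact
    return "none"           # neither hovering nor contact
```

For instance, with illustrative threshold values such as identify_operating_state(f1, f2, th1=30, th2=120) on 8-bit frames, a hovering state would be reported whenever the maximum (or average) differential intensity lies between the two thresholds.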
Referring to the accompanying drawings, a detection method of the optical touch device 1 according to another embodiment of the present disclosure includes the following steps.
Step S41: The light control unit 12 controls the light source 11 to illuminate alternately in a first brightness value (e.g. shown as rectangles having the longest length), a second brightness value (e.g. shown as rectangles having the second longest length) and a third brightness value (e.g. shown as rectangles having the shortest length).
Step S42: The image sensor 14 captures, at a fixed sampling frequency, reflected light 113 formed by incident light 111 emitted into the light guide 13 by the light source 11 through the incident surface 131 and then dispersed toward the touch surface 132 by the microstructures 134, 134′ to eject from the touch surface 132 and then reflected by the object 2, 2′ to pass through the light guide 13 and to eject from the ejection surface 133 so as to generate a first image frame f1 corresponding to the first brightness value, a second image frame f2 corresponding to the second brightness value and a third image frame f3 corresponding to the third brightness value, and the image frames are sent to the processing unit 15 for post-processing, wherein as the first brightness value is larger than the second brightness value and the second brightness value is larger than the third brightness value, an average intensity of the first image frame f1 is larger than that of the second image frame f2 and an average intensity of the second image frame f2 is larger than that of the third image frame f3.
Step S43: The processing unit 15 then calculates a first differential image (f1-f3) of the first image frame f1 and the third image frame f3 and calculates a second differential image (f2-f3) of the second image frame f2 and the third image frame f3, and this step is also configured to eliminate the interference from ambient light.
Step S44: The processing unit 15 identifies whether the object is in a hovering state or a contact state according to comparison results of comparing the first differential image (f1-f3) and the second differential image (f2-f3) with at least one threshold.
For example, in one embodiment, when the pixel intensity of a part of the image area or a maximum pixel intensity of the first differential image (f1-f3) is larger than a first threshold TH1 and the pixel intensity of all pixels or a maximum pixel intensity of the second differential image (f2-f3) is smaller than a second threshold TH2, it means that the object 2 can be illuminated by the stronger dispersed light 112 but cannot be illuminated by the weaker dispersed light 112, and thus the object 2 is identified in a hovering state. When the pixel intensity of a part of the image area or a maximum pixel intensity of the second differential image (f2-f3) is larger than the second threshold TH2, it means that the object 2′ is in contact with the touch surface 132 and reflects even the weaker dispersed light 112, and thus the object 2′ is identified in a contact state, wherein the first threshold TH1 may be identical to or different from the second threshold TH2.
In another embodiment, when an average pixel intensity of the first differential image (f1-f3) is larger than a first threshold TH1 and an average pixel intensity of the second differential image (f2-f3) is smaller than a second threshold TH2, the object 2 is identified in a hovering state. When the average pixel intensity of the second differential image (f2-f3) is larger than the second threshold TH2, the object 2′ is identified in a contact state, wherein the first threshold TH1 may be identical to or different from the second threshold TH2. In other words, in this embodiment the whole (i.e. average intensity) of the first differential image (f1-f3) and the second differential image (f2-f3) is compared with a same threshold or compared with different thresholds respectively.
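Under the same assumptions as the earlier sketch, the three-brightness-value identification may be expressed as:

```python
import numpy as np

def identify_operating_state_3(f1, f2, f3, th1, th2, use_average=False):
    """Identify the operating state from three image frames captured with the
    first (largest), second and third (smallest) brightness values; the
    thresholds TH1 and TH2 may be identical or different."""
    d1 = f1.astype(np.int32) - f3.astype(np.int32)     # first differential image (f1 - f3)
    d2 = f2.astype(np.int32) - f3.astype(np.int32)     # second differential image (f2 - f3)
    v1 = d1.mean() if use_average else d1.max()        # characteristic value of (f1 - f3)
    v2 = d2.mean() if use_average else d2.max()        # characteristic value of (f2 - f3)
    if v2 > th2:
        return "contact"    # object reflects even the weaker dispersed light
    if v1 > th1:
        return "hovering"   # object reflects only the stronger dispersed light
    return "none"
```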
In another embodiment, a differential image may first be denoised, e.g. filtered to become a filtered differential image so as to reduce the interference from noise (e.g. using a low-pass filter) and the interference from ambient light (e.g. using a high-pass filter), and then a filtered maximum pixel intensity and/or a filtered average pixel intensity of the filtered differential image is compared with at least one threshold so as to identify an operating state.
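A minimal sketch of such pre-filtering, using a small box blur as the low-pass filter and subtraction of a large-window blur as the high-pass step (both filter choices are illustrative assumptions, not mandated by the disclosure), is:

```python
import numpy as np
from scipy.ndimage import uniform_filter  # box filters used as simple low-pass filters

def filtered_characteristics(diff, noise_size=3, background_size=31):
    """Return a filtered maximum and a filtered average pixel intensity of a
    differential image (illustrative filter sizes)."""
    smoothed = uniform_filter(diff.astype(np.float32), size=noise_size)   # low-pass: suppress pixel noise
    background = uniform_filter(smoothed, size=background_size)           # estimate slowly varying residual light
    highpassed = smoothed - background                                    # high-pass: keep the localized reflection
    return float(highpassed.max()), float(smoothed.mean())
```

The filtered maximum pixel intensity and/or the filtered average pixel intensity would then be compared with the at least one threshold exactly as in the unfiltered cases described above.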
In other words, in every embodiment of the present disclosure, a characteristic value of the differential image is compared with at least one threshold so as to identify the operating state, wherein the characteristic value may be a maximum pixel intensity, an average pixel intensity, a maximum pixel intensity of a filtered differential image and/or an average pixel intensity of a filtered differential image, but not limited thereto.
As mentioned above, in the light guide of a conventional optical touch device, dispersing structures are formed on a touch surface to frustrate total internal reflection at the touch surface such that light can eject from the touch surface. The present disclosure further provides an optical touch device and a detection method thereof in which the microstructures are formed on the ejection surface and/or inside the light guide rather than on the touch surface, and in which a hovering state or a contact state of an object is identified according to a differential image such that the interference from ambient light can be effectively eliminated.
Although the disclosure has been explained in relation to its preferred embodiment, it is not intended to limit the disclosure. It is to be understood that many other possible modifications and variations can be made by those skilled in the art without departing from the spirit and scope of the disclosure as hereinafter claimed.
Number | Date | Country | Kind
---|---|---|---
101106481 | Feb. 29, 2012 | TW | national