This utility application claims priority to Taiwan Application Serial Number 099103874, filed Feb. 9, 2010, and Taiwan Application Serial Number 099126731, filed Aug. 11, 2010, which are incorporated herein by reference.
1. Field of the Invention
The present invention relates to an object-detecting system and method, and more particularly, to an object-detecting system and method that use non-coincident fields of light and a single-line image sensor.
2. Description of the Prior Art
Because a touch screen allows a user to intuitively input the coordinates corresponding to a monitor by touch, the touch screen has become a common input apparatus equipped with the monitor. Touch screens have been widely used in electronic products with a screen, such as monitors, notebook computers, tablet computers, automatic teller machines (ATMs), point-of-sale (POS) terminals, tourist guiding systems, and industrial control systems.
In addition to the traditional resistive touch screen and capacitive touch screen, which users have to touch to operate, users can also input the coordinates without touching the screen by using an image-capturing device. The prior art related to the non-contact touch screen (also called the optical touch screen) using an image-capturing unit has been disclosed in U.S. Pat. No. 4,507,557, and discussion of unnecessary details will be hereby omitted.
To analyze the position of an input point more precisely or to support multi-touch, various design solutions involving different types of light sources, light-reflecting devices and light-guiding devices have been proposed to provide more angular information related to the positions of input points for analyzing those positions. For example, U.S. Pat. No. 7,460,110 discloses an apparatus including a waveguide, mirrors extending along both sides of the waveguide, and a light source to form an upper layer and a lower layer of coincident fields of light simultaneously. Accordingly, the image-capturing unit can capture images of the upper layer and the lower layer simultaneously.
However, it is necessary to use an expensive image sensor, such as an area image sensor, a multiple-line image sensor or a double-line image sensor, to capture the images of the upper layer and the lower layer simultaneously. Moreover, the optical touch screen needs more computing resources to analyze the image captured by the area image sensor, the multiple-line image sensor or the double-line image sensor, especially the area image sensor. Additionally, these image sensors, especially the double-line image sensor, may sense the wrong fields of light or fail to sense a field of light because of assembly errors of the optical touch screen.
Accordingly, an aspect of the present invention is to provide an object-detecting system and method for detecting a target position of an object on an indicating plane by using an optical approach. Particularly, the object-detecting system and method of the invention apply non-coincident fields of light and single-line image sensors to solve the problems of the prior art.
Additionally, another aspect of the invention is to provide an object-detecting system and method for detecting information, such as an object shape, an object area, an object stereo-shape and an object volume, of an object in the indicating space including the indicating plane.
Additionally, another aspect of the invention is to provide preferred settings of the operation times of image-capturing units and the exposure times of light-emitting units in the object-detecting system to improve the quality of the captured images.
An object-detecting system, according to the first preferred embodiment of the invention, includes a peripheral member, a light-reflecting device, a controlling/processing unit, a first light-emitting unit, a second light-emitting unit, a third light-emitting unit, and a first image-capturing unit. The peripheral member defines an indicating space and an indicating plane of the indicating space on which an object directs a target position. The peripheral member has a relationship with the object. The indicating space defines a first side, a second side adjacent to the first side, a third side adjacent to the second side, and a fourth side adjacent to the third side and the first side. The third side and the fourth side form a first edge. The light-reflecting device is disposed on the peripheral member and located at the first side. The first light-emitting unit is electrically connected to the controlling/processing unit, disposed on the peripheral member, and located at the first side. The first light-emitting unit is controlled by the controlling/processing unit to emit a first light which passes through the indicating space to form a first field of light. The second light-emitting unit is electrically connected to the controlling/processing unit, disposed on the peripheral member, and located at the second side. The third light-emitting unit is electrically connected to the controlling/processing unit, disposed on the peripheral member, and located at the third side. The third light-emitting unit is controlled by the controlling/processing unit to emit a second light which passes through the indicating space to form a second field of light.
The second light-emitting unit is controlled by the controlling/processing unit to selectively emit the first light synchronously or asynchronously with the first light-emitting unit, or to selectively emit the second light synchronously or asynchronously with the third light-emitting unit. The first image-capturing unit is electrically connected to the controlling/processing unit, and disposed around the first edge. The first image-capturing unit defines a first image-capturing point. The first image-capturing unit is controlled by the controlling/processing unit to capture a first image on the first side of the indicating space, and selectively capture a second image on the second side of the indicating space and a first reflected image reflected by the light reflecting device on the second side of the indicating space when the first field of light is formed. The first image-capturing unit is also controlled by the controlling/processing unit to capture a second reflected image reflected by the light reflecting device on the third side of the indicating space and selectively capture a third image on the second side of the indicating space and a third reflected image reflected by the light reflecting device on the second side of the indicating space when the second field of light is formed. Moreover, the controlling/processing unit processes the first image and the second reflected image and selectively processes at least two among the second image, the first reflected image, the third image and the third reflected image to determine object information of the object located in the indicating space.
In one embodiment, the light-reflecting device is a plane mirror.
In another embodiment, the light-reflecting device includes a first reflecting surface and a second reflecting surface. The first reflecting surface and the second reflecting surface are substantially perpendicular to each other and face toward the indicating space. The indicating plane defines a main extending surface. The first reflecting surface defines a first sub-extending surface, and the second reflecting surface defines a second sub-extending surface. The first sub-extending surface and the second sub-extending surface each meet the main extending surface at an angle of 45°.
In one embodiment, the first image-capturing unit is a line image sensor.
In one embodiment, the controlling/processing unit stores a plurality of operation times. Each of the operation times includes at least one exposure time. Each of the first light-emitting unit, the second light-emitting unit and the third light-emitting unit corresponds to at least one of the exposure times. The controlling/processing unit controls each of the light-emitting units to emit the first light and/or the second light in accordance with the at least one exposure time corresponding to said one light-emitting unit. The controlling/processing unit also controls the first image-capturing unit to capture the images within each of the operation times.
In another embodiment, the object-detecting system according to the first preferred embodiment of the invention further includes a fourth light-emitting unit and a second image-capturing unit. The second side and the third side form a second edge. The fourth light-emitting unit is electrically connected to the controlling/processing unit, disposed on the peripheral member, and located at the fourth side. The fourth light-emitting unit is controlled by the controlling/processing unit to selectively emit the first light synchronously or asynchronously with the first light-emitting unit, or to selectively emit the second light synchronously or asynchronously with the third light-emitting unit. The second image-capturing unit is electrically connected to the controlling/processing unit, and disposed around the second edge. The second image-capturing unit is controlled by the controlling/processing unit to capture a fourth image on the first side of the indicating space, and selectively to capture a fifth image on the fourth side of the indicating space and a fourth reflected image reflected by the light reflecting device on the fourth side of the indicating space when the first field of light is formed. The second image-capturing unit is controlled by the controlling/processing unit to capture a fifth reflected image reflected by the light reflecting device on the third side of the indicating space, and selectively to capture a sixth image on the fourth side of the indicating space and a sixth reflected image reflected by the light reflecting device on the fourth side of the indicating space when the second field of light is formed. 
The controlling/processing unit processes the first image, the second reflected image, the fourth image and the fifth reflected image and selectively processes at least two among the second image, the first reflected image, the third reflected image, the third image, the fifth image, the fourth reflected image, the sixth image and the sixth reflected image to determine the object information of the object located in the indicating space.
In one embodiment, the second image-capturing unit is a line image sensor.
In one embodiment, each of the first light-emitting unit, the second light-emitting unit, the third light-emitting unit and the fourth light-emitting unit is a line light source.
According to the second preferred embodiment of the invention, the basic elements to perform the object-detecting method of the invention include a peripheral member, a light-reflecting device, a first light-emitting unit, a second light-emitting unit and a third light-emitting unit. The peripheral member defines an indicating space and an indicating plane of the indicating space on which an object directs a target position. The peripheral member has a relationship with the object. The indicating space defines a first side, a second side adjacent to the first side, a third side adjacent to the second side, and a fourth side adjacent to the third side and the first side. The light-reflecting device is disposed on the peripheral member, and located at the first side. The first light-emitting unit is disposed on the peripheral member, and located at the first side. The second light-emitting unit is disposed on the peripheral member, and located at the second side. The third light-emitting unit is disposed on the peripheral member, and located at the third side. The object-detecting method according to the invention is first to control the first light-emitting unit to emit a first light, and selectively to control the second light-emitting unit to emit the first light synchronously or asynchronously with the first light-emitting unit, where the first light passes through the indicating space to form a first field of light. Then, the object-detecting method according to the invention is to capture a first image on the first side of the indicating space, and selectively to capture a second image on the second side of the indicating space and a first reflected image reflected by the light reflecting device on the second side of the indicating space when the first field of light is formed.
Next, the object-detecting method according to the invention is to control the third light-emitting unit to emit a second light, and selectively to control the second light-emitting unit to emit the second light synchronously or asynchronously with the third light-emitting unit, where the second light passes through the indicating space to form a second field of light. Then, the object-detecting method according to the invention is to capture a second reflected image reflected by the light reflecting device on the third side of the indicating space, and selectively to capture a third image on the second side of the indicating space and a third reflected image reflected by the light reflecting device on the second side of the indicating space when the second field of light is formed. Finally, the object-detecting method according to the invention is to process the first image and the second reflected image, and selectively to process at least two among the second image, the first reflected image, the third image and the third reflected image, to determine the object information of the object located in the indicating space.
An object-detecting system according to the third preferred embodiment of the invention is implemented on an indicating plane on which an object directs a target position. The object-detecting system according to the invention includes a first image-capturing unit, a plurality of light-emitting units and a controlling/processing unit. The first image-capturing unit is disposed around a first edge of the indicating plane. The plurality of light-emitting units are disposed around the periphery of the indicating plane. The controlling/processing unit stores a plurality of operation times. Each of the operation times includes at least one exposure time. Each of the light-emitting units corresponds to at least one of the exposure times. The controlling/processing unit controls each of the light-emitting units to emit light in accordance with the at least one exposure time corresponding to said one light-emitting unit. The controlling/processing unit also controls the first image-capturing unit to capture images relative to the indicating plane within each of the operation times.
In one embodiment, each of the exposure times is less than or equal to the operation time within which said exposure time falls.
In one embodiment, all of the operation times are equivalent and do not overlap one another, and all of the exposure times are equivalent.
In one embodiment, all of the operation times are equivalent and do not overlap one another, and at least one of the exposure times is not equivalent to the others.
In one embodiment, at least two of the operation times overlap one another.
In another embodiment, the object-detecting system according to the third embodiment of the invention further includes a peripheral member and a light-reflecting device. The peripheral member defines an indicating space and the indicating plane of the indicating space. The peripheral member has a relationship with the object. The indicating space defines a first side, a second side adjacent to the first side, a third side adjacent to the second side, and a fourth side adjacent to the third side and the first side. The third side and the fourth side form the first edge. The plurality of light-emitting units include a first light-emitting unit, a second light-emitting unit and a third light-emitting unit. The first light-emitting unit is located at the first side. The first light-emitting unit is controlled by the controlling/processing unit to emit a first light which passes through the indicating space to form a first field of light. The second light-emitting unit is located at the second side. The third light-emitting unit is located at the third side. The third light-emitting unit is controlled by the controlling/processing unit to emit a second light which passes through the indicating space to form a second field of light. The second light-emitting unit is controlled by the controlling/processing unit to selectively emit the first light synchronously or asynchronously with the first light-emitting unit, or to selectively emit the second light synchronously or asynchronously with the third light-emitting unit. The first image-capturing unit defines a first image-capturing point. The first image-capturing unit is controlled by the controlling/processing unit to capture a first image on the first side of the indicating space, and selectively capture a second image on the second side of the indicating space and a first reflected image reflected by the light reflecting device on the second side of the indicating space when the first field of light is formed. 
The first image-capturing unit is also controlled by the controlling/processing unit to capture a second reflected image reflected by the light reflecting device on the third side of the indicating space and selectively capture a third image on the second side of the indicating space and a third reflected image reflected by the light reflecting device on the second side of the indicating space when the second field of light is formed. The controlling/processing unit processes the first image and the second reflected image and selectively processes at least two among the second image, the first reflected image, the third image and the third reflected image to determine object information of the object located in the indicating space.
According to the fourth preferred embodiment of the invention, the basic elements to perform the object-detecting method of the invention include an indicating plane, a plurality of light-emitting units and a plurality of operation times. An object directs a target position on the indicating plane. The plurality of light-emitting units are disposed around the periphery of the indicating plane. The plurality of operation times are provided. Each of the operation times has at least one exposure time. Each of the light-emitting units corresponds to at least one of the exposure times. The object-detecting method according to the invention is first to control each of the light-emitting units to emit light in accordance with the at least one exposure time corresponding to said one light-emitting unit. Finally, the object-detecting method according to the invention is to capture images relative to the indicating plane within each of the operation times.
The advantage and spirit of the invention may be understood by the following recitations together with the appended drawings.
The invention provides an object-detecting system and method for detecting a target position of an object on an indicating plane by using an optical approach. Additionally, the object-detecting system and method of the invention can detect information, such as an object shape, an object area, an object stereo-shape and an object volume, of an object in the indicating space including the indicating plane. Particularly, the object-detecting system and method of the invention apply non-coincident fields of light. Therefore, the object-detecting system and method of the invention can be operated with a cheaper image sensor and fewer computing resources.
The objective of the present invention will no doubt become obvious to those of ordinary skill in the art after reading the following detailed description of the preferred embodiment, which is illustrated in the various figures and drawings.
Referring to
As shown in
The peripheral member 14 defines an indicating space and an indicating plane 10 of the indicating space on which an object directs the target positions (P1, P2). The peripheral member 14 has a relationship with the object. The indicating plane 10 has a first side 102, a second side 104 adjacent to the first side 102, a third side 106 adjacent to the second side 104, and a fourth side 108 adjacent to the third side 106 and the first side 102. The third side 106 and the fourth side 108 form a first edge C1, and the second side 104 and the third side 106 form a second edge C2.
As shown in
As shown in
In one embodiment, the light-reflecting device 13 can be a plane mirror.
In another embodiment, as shown in
The first light-emitting unit 122 is controlled by the controlling/processing unit 11 to emit a first light. The first light passes through the indicating space to form a first field of light. The third light-emitting unit 126 is controlled by the controlling/processing unit 11 to emit a second light. The second light passes through the indicating space to form a second field of light. The second light-emitting unit 124 is controlled by the controlling/processing unit 11 to selectively emit the first light synchronously or asynchronously with the first light-emitting unit 122, or to selectively emit the second light synchronously or asynchronously with the third light-emitting unit 126. Particularly, the controlling/processing unit 11 controls the first field of light and the second field of light so that they are not formed at the same time.
The first image-capturing unit 16 is controlled by the controlling/processing unit 11 to capture a first image on the first side 102 of the indicating space, and selectively capture a second image on the second side 104 of the indicating space and a first reflected image reflected by the light reflecting device 13 on the second side 104 of the indicating space when the first field of light is formed. The first image-capturing unit 16 is also controlled by the controlling/processing unit 11 to capture a second reflected image reflected by the light reflecting device 13 on the third side 106 of the indicating space and selectively capture a third image on the second side 104 of the indicating space and a third reflected image reflected by the light reflecting device 13 on the second side 104 of the indicating space when the second field of light is formed. These images and reflected images record the obstruction of the object to the first light and the second light in the indicating space, that is, the shadows the object projects onto these images and reflected images.
Finally, the controlling/processing unit 11 processes the first image and the second reflected image and selectively processes at least two among the second image, the first reflected image, the third image and the third reflected image to determine object information of the object located in the indicating space.
In practice, the first image-capturing unit 16 can be a line image sensor.
In one embodiment, the object information includes a relative position of the target position relating to the indicating plane 10. The controlling/processing unit 11 determines a first object point based on the object on the first side 102 in the first image or the object on the second side 104 in the second image. The controlling/processing unit 11 also determines a first reflective object point according to the object on the second side 104 and the third side 106 in the first reflected image. The controlling/processing unit 11 also determines a first direct path according to the connective relationship between the first image-capturing point and the first object point, and determines a first reflective path according to the connective relationship between the first image-capturing point and the first reflective object point and the light-reflecting device 13. Furthermore, the controlling/processing unit 11 determines the relative position based on the intersection of the first direct path and the first reflective path.
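As an illustration only, and not as the claimed implementation, the intersection of the first direct path and the first reflective path can be sketched by "unfolding" the mirror: reflecting the first image-capturing point across the mirror line turns the reflective path into a straight line. The coordinate frame, the mirror lying along the line y = mirror_y, the angle conventions, and the function names below are all assumptions of this sketch.

```python
import math

def line_intersection(p1, d1, p2, d2):
    """Intersect two 2-D lines given as a point and a direction vector."""
    (x1, y1), (dx1, dy1) = p1, d1
    (x2, y2), (dx2, dy2) = p2, d2
    det = dx1 * dy2 - dy1 * dx2
    if abs(det) < 1e-12:
        return None  # parallel sight lines: no unique intersection
    t = ((x2 - x1) * dy2 - (y2 - y1) * dx2) / det
    return (x1 + t * dx1, y1 + t * dy1)

def locate_target(camera, direct_angle, reflected_angle, mirror_y):
    """camera: the image-capturing point (x, y); direct_angle and
    reflected_angle: directions (radians, from the camera) in which the
    object's shadow is seen directly and via the mirror; mirror_y: the
    plane mirror lies along the line y = mirror_y."""
    d_direct = (math.cos(direct_angle), math.sin(direct_angle))
    # Unfold the mirror: the reflective path is a straight line from the
    # camera's mirror image (a virtual camera) with the y-direction negated.
    cx, cy = camera
    virtual_camera = (cx, 2.0 * mirror_y - cy)
    d_reflected = (math.cos(reflected_angle), -math.sin(reflected_angle))
    return line_intersection(camera, d_direct, virtual_camera, d_reflected)
```

For example, with the camera at the origin, a mirror along y = 2, and a target at (1, 1), the direct sight line has angle atan2(1, 1) and the mirrored sight line points at the virtual target (1, 3); intersecting the two unfolded lines recovers the target position.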
In one embodiment, the object information includes an object shape and/or an object area of the object projected on the indicating plane 10. The controlling/processing unit 11 determines a first object point and a second object point according to the object on the first side 102 in the first image or the object on the second side 104 in the second image. The controlling/processing unit 11 also determines a first reflective object point and a second reflective object point according to the object on the second side 104 and the third side 106 in the first reflected image. The controlling/processing unit 11 further determines a first direct planar-path according to the connective relationship between the first image-capturing point and the first object point and the connective relationship between the first image-capturing point and the second object point, and determines a first reflective planar-path according to the connective relationship between the first image-capturing point and the first reflective object point, the connective relationship between the first image-capturing point and the second reflective object point, and the light-reflecting device 13. Moreover, the controlling/processing unit 11 determines the object shape and/or the object area of the object according to the shape and/or the area of the intersection region of the first direct planar-path and the first reflective planar-path. Furthermore, the object information includes the object stereo-shape and/or the object volume of the object located in the indicating space. The controlling/processing unit 11 also separates the first image, the second image and the first reflected image into a plurality of first sub-images, a plurality of second sub-images and a plurality of first reflective sub-images respectively.
The controlling/processing unit 11 further determines a plurality of object shapes and/or a plurality of object areas according to the first sub-images, the second sub-images and the first reflective sub-images, and stacks the object shapes and/or the object areas along a normal direction of the indicating plane 10 to determine the object stereo-shape and/or the object volume of the object.
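For illustration only, and under the added assumption that the sub-image slices are evenly spaced, the stacking step above amounts to a Riemann-sum integration of the per-slice object areas along the normal of the indicating plane 10; the function name and its inputs are hypothetical.

```python
def stack_volume(slice_areas, slice_spacing):
    """Approximate the object volume by stacking per-slice object areas
    (one area per sub-image) along the normal of the indicating plane.
    slice_spacing: distance between adjacent slices (assumed uniform)."""
    return sum(slice_areas) * slice_spacing
```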
In one embodiment, the object information includes an object stereo-shape and/or an object volume of the object located in the indicating space. The controlling/processing unit 11 determines at least three object points according to the object on the first side 102 in the first image or the object on the second side 104 in the second image. The controlling/processing unit 11 also determines at least three reflective object points according to the object on the second side 104 and the third side 106 in the first reflected image. The controlling/processing unit 11 further determines a first direct stereo-path according to the connective relationship between the first image-capturing point and the at least three object points, and determines a first reflective stereo-path according to the connective relationship between the first image-capturing point and the at least three reflective object points and the light-reflecting device 13, and determines the object stereo-shape and/or the object volume of the object according to the stereo-shape and/or the volume of the intersection space of the first direct stereo-path and the first reflective stereo-path.
In another embodiment of the present invention, the controlling/processing unit 11 stores a plurality of operation times. Each of the operation times includes at least one exposure time. Each of the first light-emitting unit 122, the second light-emitting unit 124 and the third light-emitting unit 126 corresponds to at least one of the exposure times. The controlling/processing unit 11 controls each of the first light-emitting unit 122, the second light-emitting unit 124 and the third light-emitting unit 126 to emit the first light and/or the second light in accordance with the at least one exposure time corresponding to said one light-emitting unit (122, 124, 126). The controlling/processing unit 11 also controls the first image-capturing unit 16 to capture these images and reflected images within each of the operation times.
In one embodiment, each of the exposure times is less than or equivalent to the operation time that said one exposure time is within. In another embodiment, all of the operation times are equivalent and do not overlap one another, and all of the exposure times are equivalent. In another embodiment, all of the operation times are equivalent and do not overlap one another, and at least one of the exposure times is not equivalent to the others. In another embodiment, at least two of the operation times overlap one another.
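A minimal sketch of the timing constraints just described, with hypothetical function names and each operation time represented as a (start, end) window:

```python
def exposures_fit(operation_times, exposures):
    """operation_times: list of (start, end) windows; exposures: list of
    (operation_index, duration) pairs, one or more per light-emitting unit.
    Each exposure time must be less than or equal to its operation time."""
    for op_index, duration in exposures:
        start, end = operation_times[op_index]
        if duration > end - start:
            return False  # exposure would not fit within its operation time
    return True

def operations_disjoint(operation_times):
    """True when no two operation times overlap one another."""
    windows = sorted(operation_times)
    return all(prev[1] <= nxt[0] for prev, nxt in zip(windows, windows[1:]))
```

The overlapping-operation-times embodiment simply corresponds to `operations_disjoint` returning False while `exposures_fit` still holds.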
As shown in
The second image-capturing unit 18 is electrically connected to the controlling/processing unit 11, and is disposed around the second edge C2. The second image-capturing unit 18 further defines a second image-capturing point. The second image-capturing unit 18 is controlled by the controlling/processing unit 11 to capture a fourth image on the first side 102 of the indicating space, and selectively to capture a fifth image on the fourth side 108 of the indicating space and a fourth reflected image reflected by the light reflecting device 13 on the fourth side 108 of the indicating space when the first field of light is formed. The second image-capturing unit 18 is controlled by the controlling/processing unit 11 to capture a fifth reflected image reflected by the light reflecting device 13 on the third side 106 of the indicating space, and selectively to capture a sixth image on the fourth side 108 of the indicating space and a sixth reflected image reflected by the light reflecting device 13 on the fourth side 108 of the indicating space when the second field of light is formed.
In the preferred embodiment, the controlling/processing unit 11 processes the first image, the second reflected image, the fourth image and the fifth reflected image and selectively processes at least two among the second image, the first reflected image, the third reflected image, the third image, the fifth image, the fourth reflected image, the sixth image and the sixth reflected image to determine the object information of the object located in the indicating space.
In practice, the second image-capturing unit 18 can be a line image sensor.
In practice, each of the first light-emitting unit 122, the second light-emitting unit 124, the third light-emitting unit 126 and the fourth light-emitting unit 128 can be a line light source. Moreover, the line light source (122, 124, 126 and 128) can be formed by a stick light-guiding device and a light-emitting diode (such as an infrared light-emitting diode) disposed on one end of the stick light-guiding device. The light emitted by the light-emitting diode enters that end of the stick light-guiding device, which guides the light to the indicating plane 10. Furthermore, the line light source (122, 124, 126 and 128) can be a series of light-emitting diodes.
In practice, the background values of these reflected images are weak, so the determination of the shadows projected in these reflected images would be affected. To solve this problem, the controlling/processing unit 11 can turn on the second light-emitting unit 124, the third light-emitting unit 126, the fourth light-emitting unit 128, the first image-capturing unit 16 and the second image-capturing unit 18 for a longer time, or twice, to make the exposure times of these reflected images longer than those of the other images. Moreover, the quantity of illumination of the second field of light can be made higher than that of the first field of light by controlling the gain values of the first light-emitting unit 122, the second light-emitting unit 124, the third light-emitting unit 126 and the fourth light-emitting unit 128, the driving electric current of the light-emitting diodes, or the number of ignited light-emitting diodes.
To improve the quality of the captured images, in one embodiment, the controlling/processing unit 11 stores a plurality of operation times. Each of the operation times includes at least one exposure time. The first image-capturing unit 16 and the second image-capturing unit 18 respectively correspond to at least one of the operation times. Each of the first light-emitting unit 122, the second light-emitting unit 124, the third light-emitting unit 126 and the fourth light-emitting unit 128 corresponds to at least one of the exposure times. The controlling/processing unit 11 controls each of the first light-emitting unit 122, the second light-emitting unit 124, the third light-emitting unit 126 and the fourth light-emitting unit 128 to emit the first light and/or the second light in accordance with the at least one exposure time corresponding to said one light-emitting unit (122, 124, 126, 128). The controlling/processing unit 11 also controls the first image-capturing unit 16 to capture the images within each of the operation times corresponding to the first image-capturing unit 16, and controls the second image-capturing unit 18 to capture the images within each of the operation times corresponding to the second image-capturing unit 18.
The manner in which the object-detecting system 1 of the invention forms the fields of light, and the images captured by the system, are described below with an example of two input points (P1, P2) in the indicating plane 10 of
As shown in
As shown in
As shown in
Obviously, the object-detecting system 1 of the invention can precisely calculate the locations of the input points P1 and P2 of
Please refer to
As shown in
Then, the object-detecting method 2 according to the invention performs step S22 to capture a first image on the first side of the indicating space, and selectively to capture a second image on the second side of the indicating space and a first reflected image reflected by the light reflecting device on the second side of the indicating space when the first field of light is formed.
Afterward, the object-detecting method 2 according to the invention performs step S24 to control the third light-emitting unit to emit a second light, and selectively to control the second light-emitting unit to emit the second light synchronously or asynchronously with the third light-emitting unit, where the second light passes through the indicating space to form a second field of light.
Then, the object-detecting method 2 according to the invention performs step S26 to capture a second reflected image reflected by the light reflecting device on the third side of the indicating space, and selectively to capture a third image on the second side of the indicating space and a third reflected image reflected by the light reflecting device on the second side of the indicating space when the second field of light is formed.
Finally, the object-detecting method 2 according to the invention performs step S28 to process the first image and the second reflected image, and selectively to process at least two among the second image, the first reflected image, the third reflected image and the third image, to determine the object information of the object located in the indicating space. The availability and determination regarding the object information are as described above, and discussion of unnecessary details will be hereby omitted.
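The sequence of steps S20 through S28 can be summarized in the following sketch. The driver objects (emitters, camera, locate) are hypothetical stand-ins for the hardware and processing described above, not part of the original disclosure.

```python
# Illustrative sketch of steps S20-S28 of object-detecting method 2.
# The emitters, camera and locate interfaces are hypothetical.

def detect_objects(emitters, camera, locate):
    # S20: the first light-emitting unit forms the first field of light;
    # the second unit may emit synchronously or asynchronously with it.
    emitters.emit("first")
    # S22: capture the first image (first side) and, selectively, the
    # second image and the first reflected image (second side).
    first_image = camera.capture(side=1)
    second_image = camera.capture(side=2)                      # optional
    first_reflected = camera.capture(side=2, reflected=True)   # optional
    # S24: the third light-emitting unit forms the second field of light.
    emitters.emit("third")
    # S26: capture the second reflected image (third side) and, selectively,
    # the third image and the third reflected image (second side).
    second_reflected = camera.capture(side=3, reflected=True)
    third_image = camera.capture(side=2)                       # optional
    third_reflected = camera.capture(side=2, reflected=True)   # optional
    # S28: process the two mandatory images plus at least two of the
    # optional ones to determine the object information.
    return locate(first_image, second_reflected,
                  [second_image, first_reflected, third_image, third_reflected])
```

The two mandatory inputs to step S28 are the first image and the second reflected image; the remaining captures are optional, of which at least two are processed.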
To improve the quality of the images captured in the object-detecting method 2, in one embodiment, a plurality of operation times are provided. Each of the operation times includes at least one exposure time. Each of the first light-emitting unit, the second light-emitting unit and the third light-emitting unit corresponds to at least one of the exposure times. Step S20 and step S24 are further performed to control each of the light-emitting units to emit the first light and/or the second light in accordance with the at least one exposure time corresponding to said light-emitting unit. Moreover, step S22 and step S26 are further performed to capture the images within each of the operation times. The settings of the operation times and the exposure times are as described above, and discussion of unnecessary details will be hereby omitted.
To enhance the accuracy of the object-detecting method 2, in one embodiment, a fourth light-emitting unit is located at the fourth side. Step S20 is further performed to selectively control the fourth light-emitting unit to emit the first light synchronously or asynchronously with the first light-emitting unit. Step S22 is further performed to capture a fourth image on the first side of the indicating space, and selectively capture a fifth image on the fourth side of the indicating space and a fourth reflected image reflected by the light-reflecting device on the fourth side of the indicating space. Step S24 is further performed to selectively control the fourth light-emitting unit to emit the second light synchronously or asynchronously with the third light-emitting unit. Step S26 is further performed to capture a fifth reflected image reflected by the light-reflecting device on the third side of the indicating space, and selectively capture a sixth image on the fourth side of the indicating space and a sixth reflected image reflected by the light-reflecting device on the fourth side of the indicating space. Step S28 is further performed to process the first image, the second reflected image, the fourth image and the fifth reflected image, and selectively process at least two among the second image, the first reflected image, the third reflected image, the third image, the fifth image, the fourth reflected image, the sixth image and the sixth reflected image, to determine the object information of the object located in the indicating space. Furthermore, to improve the quality of the images captured in the object-detecting method 2, the operation times and the exposure times mentioned above can be used in this embodiment.
In one embodiment, the first image, the second image, the third image, the first reflected image, the second reflected image and the third reflected image can be captured by a single-line image sensor. Moreover, the fourth image, the fifth image, the sixth image, the fourth reflected image, the fifth reflected image and the sixth reflected image can be captured by another single-line image sensor.
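A single-line image sensor yields a one-dimensional intensity profile, in which an input point occluding the field of light appears as a dark shadow span. A simple threshold test, sketched below, recovers the pixel extent of each shadow; the threshold value and sample profile are illustrative assumptions.

```python
# Sketch of shadow detection in a single-line image sensor profile.
# The threshold and example values are assumptions for illustration.

def find_shadows(profile, threshold):
    """Return (start, end) pixel index pairs for every contiguous run
    whose intensity falls below the threshold."""
    shadows, start = [], None
    for i, value in enumerate(profile):
        if value < threshold and start is None:
            start = i                       # shadow begins
        elif value >= threshold and start is not None:
            shadows.append((start, i - 1))  # shadow ends
            start = None
    if start is not None:                   # shadow runs to the last pixel
        shadows.append((start, len(profile) - 1))
    return shadows
```

In the system above, the center pixel of each shadow corresponds to a viewing angle from the image-capturing point; combining the angles obtained from the direct images and the mirror-reflected images yields the positions of the input points.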
Please refer to
As shown in
The first image-capturing unit 16 is disposed around a first edge C1 of the indicating plane 10. The plurality of light-emitting units are disposed around the periphery of the indicating plane 10. The controlling/processing unit 11 stores a plurality of operation times. Each of the operation times includes at least one exposure time, and each of the light-emitting units corresponds to at least one of the exposure times. The controlling/processing unit 11 controls each of the light-emitting units to emit light in accordance with the at least one exposure time corresponding to said light-emitting unit. The controlling/processing unit 11 also controls the first image-capturing unit 16 to capture images relative to the indicating plane 10 within each of the operation times. The settings of the operation times and the exposure times are as described above, and discussion of unnecessary details will be hereby omitted.
In one embodiment, also as shown in
The plurality of light-emitting units include a first light-emitting unit 122, a second light-emitting unit 124 and a third light-emitting unit 126. The first light-emitting unit 122 is located at the first side 102. The first light-emitting unit 122 is controlled by the controlling/processing unit 11 to emit a first light which passes through the indicating space to form a first field of light. The second light-emitting unit 124 is located at the second side 104. The third light-emitting unit 126 is located at the third side 106. The third light-emitting unit 126 is controlled by the controlling/processing unit 11 to emit a second light which passes through the indicating space to form a second field of light. The second light-emitting unit 124 is controlled by the controlling/processing unit 11 to selectively emit the first light synchronously or asynchronously with the first light-emitting unit 122, or to selectively emit the second light synchronously or asynchronously with the third light-emitting unit 126.
The first image-capturing unit 16 defines a first image-capturing point. The first image-capturing unit 16 is controlled by the controlling/processing unit 11 to capture a first image on the first side 102 of the indicating space, and selectively capture a second image on the second side 104 of the indicating space and a first reflected image reflected by the light reflecting device 13 on the second side 104 of the indicating space when the first field of light is formed. The first image-capturing unit 16 is also controlled by the controlling/processing unit 11 to capture a second reflected image reflected by the light reflecting device 13 on the third side 106 of the indicating space and selectively capture a third image on the second side 104 of the indicating space and a third reflected image reflected by the light reflecting device 13 on the second side 104 of the indicating space when the second field of light is formed. Moreover, the controlling/processing unit 11 processes the first image and the second reflected image and selectively processes at least two among the second image, the first reflected image, the third image and the third reflected image to determine an object information of the object located in the indicating space. The availability and determination regarding the object information are as described above, and discussion of unnecessary details will be hereby omitted.
In one embodiment, also as shown in
Please refer to
Please refer to
Please refer to
Please refer to
Please refer to
That is, if the brightness required from the third light-emitting unit 126 and the fourth light-emitting unit 128 is estimated to be at its maximum, in accordance with the fixed noise of the image captured by the first image-capturing unit 16, the desired image quality and other effects, the operation times corresponding to the third light-emitting unit 126 and the fourth light-emitting unit 128 can be set to partially overlap, as shown in
Please refer to
Please refer to
As shown in
In one embodiment, a peripheral member defines an indicating space and the indicating plane of the indicating space. The peripheral member has a relationship with the object. The indicating space defines a first side, a second side adjacent to the first side, a third side adjacent to the second side, and a fourth side adjacent to the third side and the first side. A light-reflecting device is disposed on the peripheral member and located at the first side. The plurality of light-emitting units includes a first light-emitting unit, a second light-emitting unit and a third light-emitting unit. The first light-emitting unit is located at the first side. The second light-emitting unit is located at the second side. The third light-emitting unit is located at the third side. Step S30 in the object-detecting method 3 is performed by the steps of: (S30a) according to the at least one exposure time corresponding to the first light-emitting unit and the second light-emitting unit, controlling the first light-emitting unit to emit a first light, and selectively controlling the second light-emitting unit to emit the first light synchronously or asynchronously with the first light-emitting unit, where the first light passes through the indicating space to form a first field of light; and (S30b) according to the at least one exposure time corresponding to the third light-emitting unit and the second light-emitting unit, controlling the third light-emitting unit to emit a second light, and selectively controlling the second light-emitting unit to emit the second light synchronously or asynchronously with the third light-emitting unit, where the second light passes through the indicating space to form a second field of light. 
Moreover, step S32 is performed by the steps of: (S32a) when the first field of light is formed, capturing within each of the operation times a first image on the first side of the indicating space, and selectively capturing a second image on the second side of the indicating space and a first reflected image reflected by the light reflecting device on the second side of the indicating space; and (S32b) when the second field of light is formed, capturing within each of the operation times a second reflected image reflected by the light reflecting device on the third side of the indicating space, and selectively capturing a third image on the second side of the indicating space and a third reflected image reflected by the light reflecting device on the second side of the indicating space. The object-detecting method 3 according to the fourth preferred embodiment of the invention further includes the step of processing the first image and the second reflected image, and selectively processing at least two among the second image, the first reflected image, the third reflected image and the third image, to determine the object information of the object located in the indicating space.
In one embodiment, the plurality of light-emitting units further includes a fourth light-emitting unit located at the fourth side. Step S30a is further performed to selectively control the fourth light-emitting unit to emit the first light synchronously or asynchronously with the first light-emitting unit. Step S32a is further performed to capture a fourth image on the first side of the indicating space, and selectively capture a fifth image on the fourth side of the indicating space and a fourth reflected image reflected by the light-reflecting device on the fourth side of the indicating space. Step S30b is further performed to selectively control the fourth light-emitting unit to emit the second light synchronously or asynchronously with the third light-emitting unit. Step S32b is further performed to capture a fifth reflected image reflected by the light-reflecting device on the third side of the indicating space, and selectively capture a sixth image on the fourth side of the indicating space and a sixth reflected image reflected by the light-reflecting device on the fourth side of the indicating space. The determination of the object information of the object is further performed to process the first image, the second reflected image, the fourth image and the fifth reflected image, and selectively process at least two among the second image, the first reflected image, the third reflected image, the third image, the fifth image, the fourth reflected image, the sixth image and the sixth reflected image, to determine the object information of the object located in the indicating space. Embodiments of the first light-emitting unit, the second light-emitting unit, the third light-emitting unit, the fourth light-emitting unit and the light-reflecting device are as shown and illustrated in
With the examples and explanations above, the features and spirit of the invention are hopefully well described. Those skilled in the art will readily observe that numerous modifications and alterations of the device may be made while retaining the teaching of the invention. Accordingly, the above disclosure should be construed as limited only by the metes and bounds of the appended claims.
Number | Date | Country | Kind |
---|---|---|---|
099103874 | Feb 2010 | TW | national |
099126731 | Aug 2010 | TW | national |