1. Technical Field
The present disclosure relates to an optical navigation sensor, and in particular, to an optical navigation sensor with an edge detection function, an electronic device with the optical navigation sensor, and an operation method thereof.
2. Description of Related Art
With the development and growth of technology, more and more electronic devices have an optical navigation function. These electronic devices include an optical navigation sensor to implement the optical navigation function. Besides optical mice, optical navigation sensors are widely applied to other electronic devices, for example, a volume control knob of a sound system.
The optical navigation sensor provides a light beam to a surface of an object through a light-emitting diode, and captures images based upon the light which the surface of the object reflects. Then, the optical navigation sensor compares the currently captured image with the previously captured image, and calculates an amount of displacement.
However, a conventional optical navigation sensor has a problem: if a pixel array of the optical navigation sensor cannot accurately sense the images associated with the surface of the object, the amount of displacement calculated by the optical navigation sensor is not equal to the actual amount of displacement. Hence, how to improve the accuracy of the amount of displacement calculated by the optical navigation sensor is a problem to be solved in the technical field.
An exemplary embodiment of the present disclosure provides an optical navigation sensor. The optical navigation sensor is configured for operatively sensing a surface of a rotation unit on which at least one recognition block is alternately disposed. The optical navigation sensor comprises a pixel array, a navigation unit and an edge detection unit. The navigation unit is coupled to the pixel array. The edge detection unit is coupled to the pixel array and the navigation unit. The pixel array is configured for operatively capturing an image once every capturing interval. The navigation unit is configured for operatively generating a navigation signal according to the images. The navigation signal comprises a rotation direction of the rotation unit. The edge detection unit is configured for operatively generating an edge detection signal according to the images and the navigation signal. The edge detection signal comprises a number of the recognition block which passes a sensing area of the pixel array. When the rotation unit performs a rotation action, the pixel array of the optical navigation sensor starts to capture the images associated with the surface. After receiving at least two images, the navigation unit determines the rotation direction of the rotation unit in response to the images, which show a position variation of the recognition block, and generates the navigation signal. The edge detection unit receives the navigation signal and the images, and generates the edge detection signal in response to the rotation direction and the number of the recognition block which passes the sensing area.
An exemplary embodiment of the present disclosure provides an electronic device with an optical navigation function. The electronic device comprises a rotation unit and an optical navigation sensor. The rotation unit comprises a surface. At least one recognition block is alternately disposed on the surface, and light reflection coefficients of the surface and the recognition block are different. The optical navigation sensor is configured for operatively sensing the surface. The optical navigation sensor comprises a pixel array, a navigation unit and an edge detection unit. The navigation unit is coupled to the pixel array. The edge detection unit is coupled to the pixel array and the navigation unit. The pixel array is configured for operatively capturing an image once every capturing interval. The navigation unit is configured for operatively generating a navigation signal according to the images. The navigation signal comprises a rotation direction of the rotation unit. The edge detection unit is configured for operatively generating an edge detection signal according to the images and the navigation signal. The edge detection signal comprises a number of the recognition block which passes a sensing area of the pixel array. When the rotation unit performs a rotation action, the pixel array of the optical navigation sensor starts to capture the images associated with the surface. After receiving at least two images, the navigation unit determines the rotation direction of the rotation unit in response to the images, which show a position variation of the recognition block, and generates the navigation signal. The edge detection unit receives the navigation signal and the images, and generates the edge detection signal in response to the rotation direction and the number of the recognition block which passes the sensing area.
An exemplary embodiment of the present disclosure provides an operation method of an electronic device. The electronic device comprises a rotation unit and an optical navigation sensor, and the optical navigation sensor comprises a pixel array, a navigation unit and an edge detection unit. The method comprises the following steps. Step (a): at the rotation unit, performing a rotation action. The rotation unit comprises a surface, at least one recognition block is alternately disposed on the surface, and light reflection coefficients of the surface and the recognition block are different. Step (b): at the pixel array, sensing the surface and capturing an image once every capturing interval. Step (c): at the navigation unit, after receiving at least two images, determining a rotation direction of the rotation unit in response to the images, which show a position variation of the recognition block, and generating a navigation signal. The navigation signal comprises the rotation direction of the rotation unit. Step (d): at the edge detection unit, receiving the navigation signal and the images, and generating an edge detection signal in response to the rotation direction and a number of the recognition block which passes a sensing area of the pixel array. The edge detection signal comprises the number of the recognition block which passes the sensing area. Step (e): determining a rotation state of the rotation unit in response to the navigation signal and the edge detection signal. The rotation state comprises the rotation direction of the rotation unit and the number of the recognition block which passes the sensing area.
To sum up, compared to a conventional optical navigation sensor, the optical navigation sensor, the electronic device and the operation method provided by the present disclosure utilize the navigation unit to determine an amount of displacement of the rotation unit, and utilize the edge detection unit to detect the recognition block disposed on the surface of the rotation unit. By the navigation unit and the edge detection unit, the optical navigation sensor provided by the present disclosure can calculate the amount of displacement of the rotation unit more accurately, such that a back-end circuit can perform a corresponding action according to the calculated amount of displacement of the rotation unit.
In order to further understand the techniques, means and effects of the present disclosure, the following detailed descriptions and appended drawings are hereby referred, such that, through which, the purposes, features and aspects of the present disclosure can be thoroughly and concretely appreciated; however, the appended drawings are merely provided for reference and illustration, without any intention to be used for limiting the present disclosure.
The accompanying drawings are included to provide a further understanding of the present disclosure, and are incorporated in and constitute a part of this specification. The drawings illustrate exemplary embodiments of the present disclosure and, together with the description, serve to explain the principles of the present disclosure.
The aforementioned illustrations and detailed descriptions are exemplary for the purpose of further explaining the scope of the instant disclosure. Other objectives and advantages related to the instant disclosure will be illustrated in the subsequent descriptions and appended drawings.
Hereinafter, the concept of the present invention may be embodied in many different forms and should not be construed as limited to the exemplary embodiments set forth herein. Rather, the exemplary embodiments are provided so that the instant disclosure will be thorough and complete, and will fully convey the scope of the inventive concept to those skilled in the art. For ease of viewing, the relative sizes of layers and regions are exaggerated in all drawings, and similar numerals indicate like elements.
Notably, the terms first, second, third, etc., may be used herein to describe various elements or signals, but these elements or signals should not be limited by such terms. Such terminology is used only to distinguish one element from another element, or one signal from another signal. Further, the term “or” as used herein may include any one or combinations of the associated listed items.
Please refer to
At least one recognition block BK is alternately disposed on the surface of the rotation unit 11. Light reflection coefficients between the surface and the recognition block BK are different. The rotation unit 11 can perform a rotation action. For example, when the rotation unit 11 performs the rotation action, the rotation unit 11 rotates around a center of the rotation unit 11.
In the exemplary embodiment of the present disclosure, the rotation unit 11 is a ring structure. The recognition block BK is disposed on an outer surface of the rotation unit 11. The optical navigation sensor 10 is disposed corresponding to the outer surface of the rotation unit 11.
Because the light reflection coefficients of the outer surface and the recognition block BK are different from each other, a light intensity of the reflected light reflected by the outer surface is different from a light intensity of the reflected light reflected by the recognition block BK. The optical navigation sensor 10 captures the image based upon the reflected light, and determines a number of the recognition blocks BK passing a sensing area of the optical navigation sensor 10 when the rotation unit 11 performs the rotation action. Then, the optical navigation sensor 10 calculates an amount of displacement of the rotation unit 11 in response to the image and the number of the recognition blocks BK passing the sensing area of the optical navigation sensor 10.
Please refer to
Being different from the electronic device 1 shown in
Next, a distribution of the recognition block BK will be introduced. Please refer to
Notably, the distribution of the recognition blocks BK is not limited to the examples provided in the exemplary embodiment. For example, if the rotation unit includes N recognition blocks BK, the recognition blocks BK are separated by 360/N degrees from each other. From the explanation of the aforementioned exemplary embodiment, those skilled in the art should be able to deduce the other exemplary embodiments according to the disclosure of the present disclosure, as long as each two neighboring recognition blocks BK are separated from each other by the same angle, and further descriptions are therefore omitted.
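As a simple illustration of the spacing rule above, the following Python sketch (a hypothetical helper, not part of the disclosure) computes the angular positions of N evenly separated recognition blocks BK:

```python
def recognition_block_angles(n_blocks):
    """Angular positions (in degrees) of n_blocks recognition blocks
    spaced evenly around the rotation unit, per the 360/N rule."""
    if n_blocks < 1:
        raise ValueError("at least one recognition block is required")
    separation = 360.0 / n_blocks
    # each neighboring pair is separated by the same angle
    return [i * separation for i in range(n_blocks)]
```

For four blocks, the sketch yields positions at 0, 90, 180 and 270 degrees, matching the equal-angle condition stated above.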
Next, an optical navigation sensor will be introduced. Please refer to
The light-emitting unit 100, such as a light-emitting diode (LED), is configured for operatively providing a light beam to irradiate a surface of a rotation unit (not shown in
The pixel array 101 includes a plurality of pixel units. The pixel array 101 is disposed corresponding to a surface of the rotation unit 11. The pixel array 101 receives the reflected light produced when the surface of the rotation unit 11 reflects the light beam provided by the light-emitting unit 100, and captures an image in response to the reflected light once every capturing interval, wherein the images are associated with a part of the surface of the rotation unit 11.
The navigation unit 102 is configured for operatively determining a rotation direction of the rotation unit 11, when the rotation unit 11 performs the rotation action, based upon the images captured by the pixel array 101, and generating a navigation signal. The navigation signal comprises the rotation direction of the rotation unit 11.
The edge detection unit 103 is configured for operatively receiving the images and the navigation signal outputted by the navigation unit 102, and generating an edge detection signal in response to the images and the navigation signal. The edge detection signal comprises a number of the recognition block BK which passes a sensing area of the pixel array 101 within the rotation action.
The processing unit 104 receives the navigation signal and the edge detection signal, and determines a rotation state of the rotation unit 11 in response to the navigation signal and the edge detection signal to generate a rotation state signal. The rotation state comprises the rotation direction of the rotation unit 11 and the number of the recognition blocks BK that passed the sensing area of the pixel array 101 within the rotation action. The processing unit 104 outputs the rotation state signal to a host 5. The host 5 can be a desktop computer, a notebook computer or other types of computers. The host 5 establishes a connection with the optical navigation sensor 10 through a wire transmission or a wireless transmission. After receiving the rotation state signal, the host 5 implements a corresponding function based upon the rotation direction of the rotation unit 11 and the number of the recognition blocks BK passing the sensing area of the pixel array 101 indicated in the rotation state signal. Alternatively, the host 5 can be an embedded controller which is set in the electronic device 1, and the embedded controller generates a control signal in response to the rotation state signal to control associated circuits.
It is worth noting that, in another exemplary embodiment, the optical navigation sensor 10 does not include the processing unit 104. Instead, the navigation unit 102 and the edge detection unit 103 directly connect to the host 5 through the wire transmission or the wireless transmission. The navigation unit 102 outputs the navigation signal to the host 5. The edge detection unit 103 outputs the edge detection signal to the host 5. The host 5 determines the rotation state of the rotation unit 11 in response to the navigation signal and the edge detection signal. Similarly, the rotation state comprises the rotation direction of the rotation unit 11 and the number of the recognition blocks BK that passed the sensing area of the pixel array 101 within the rotation action. After determining the rotation state of the rotation unit 11, the host 5 implements the corresponding function based upon the rotation state.
Please refer to
In
Widths of the recognition blocks BK are respectively smaller than a size of the sensing area SA. The navigation unit 102 can determine a rotation direction of the rotation unit 11 according to the position variations of the recognition blocks BK within the sensing area SA. The edge detection unit 103 can calculate a number of the recognition blocks BK which pass the sensing area SA according to the position variations of the recognition blocks BK within the sensing area SA.
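The direction determination described above can be sketched in Python, assuming a hypothetical helper has already extracted the recognition block's horizontal pixel position from each of two consecutive images; the function name and the direction labels are illustrative assumptions, not the disclosed implementation:

```python
def rotation_direction(prev_x, curr_x):
    """Infer rotation direction from the recognition block's horizontal
    position (pixel column) in two consecutive captured images.
    Returns 'first' (right to left), 'second' (left to right),
    or None when no displacement is measurable."""
    if curr_x < prev_x:
        return "first"   # block moved leftward in the sensing area
    if curr_x > prev_x:
        return "second"  # block moved rightward in the sensing area
    return None          # no position variation between the images
```

This mirrors the text: at least two images are needed, and the position variation of the block within the sensing area SA decides the direction.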
The difference between the rotation unit 11′ shown in
In another exemplary embodiment, the widths of the recognition blocks BK, BK′ can also be larger than the sizes of the sensing areas SA, SA′. If the widths of the recognition blocks BK, BK′ are larger than the sizes of the sensing areas SA, SA′, the optical navigation sensor 10 needs a speed sensor to sense a rotation speed of the rotation unit 11, 11′. A processing unit (as the processing unit 2 shown in
Next, steps of determining a rotation state of a rotation unit 11 by an optical navigation sensor 10 will be introduced. Please refer to
In the exemplary embodiment, the rotation unit 11 performs a rotation action from right to left. The optical navigation sensor defines a first direction as being from right to left. On the other hand, the optical navigation sensor defines a second direction as being from left to right. Please refer to
Please refer to
Please refer to
According to the images shown in
In the exemplary embodiment, when the edge detection unit 103 determines the recognition block BK passes the sensing area SA and the rotation direction of the rotation unit 11 is the first direction, the edge counting value of the edge counter increases. For example, the edge counting value is increased by 1. The edge detection unit 103 generates an edge detection signal based upon the edge counting value, and outputs the edge detection signal to the processing unit 104. Briefly, the edge detection unit 103 can determine how many recognition blocks BK pass the sensing area SA based upon the images captured by the pixel array 101, and generate the edge detection signal.
Please refer to
Please refer to
Please refer to
Please refer to
According to the images shown in
In the exemplary embodiment, when the edge detection unit 103″ determines the recognition block BK″ passes the sensing area SA″ and the rotation direction of the rotation unit 11″ is the second direction, the edge counting value of the edge counter decreases. For example, the edge counting value is decreased by 1. The edge detection unit 103″ generates an edge detection signal based upon the edge counting value, and outputs the edge detection signal to the processing unit 104″.
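The increment and decrement behavior of the edge counter described in this embodiment and the earlier one can be sketched as follows; the class and method names are illustrative assumptions, not the disclosed implementation:

```python
class EdgeCounter:
    """Sketch of the edge counter: the edge counting value increases
    when a recognition block passes the sensing area while the rotation
    unit rotates in the first direction, and decreases when it passes
    while the rotation unit rotates in the second direction."""

    def __init__(self, initial=0):
        self.value = initial

    def block_passed(self, direction):
        """Update the edge counting value for one block that passed
        the sensing area, and return the new value."""
        if direction == "first":
            self.value += 1
        elif direction == "second":
            self.value -= 1
        else:
            raise ValueError("unknown rotation direction")
        return self.value
```

Starting from an initial value of 0, two passes in the first direction followed by one pass in the second direction leave the counter at 1, which is the net number of blocks that passed the sensing area.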
Incidentally, the first direction is from right to left, and the second direction is from left to right in the exemplary embodiment. However, the first direction and the second direction are not limited to the examples provided by the instant exemplary embodiment. Those skilled in the art can define the first direction and the second direction according to practical demands to complete the present disclosure.
It is worth noting that the processing unit (such as one of the processing units 104, 104″ described above) can reset the edge counting value recorded in the edge counter. When the edge counting value reaches a specific value, the processing unit 104 determines that the rotation unit (such as one of the rotation units 11, 11″ described above) rotates one cycle and returns to an initial rotation position. If the processing unit 104 resets the edge counter, the edge counting value will equal the initial value. Then, the edge detection unit (such as one of the edge detection units 103, 103″ described above) restarts calculating the number of the recognition blocks BK passing the sensing area SA.
The specific value is associated with the number of the recognition blocks BK disposed on the surface of the rotation unit 11. When there are N recognition blocks BK disposed on the surface of the rotation unit 11, the specific values are +N and −N.
For example, when there is one recognition block BK disposed on the surface of the rotation unit 11, the specific values are +1 and −1. When the rotation unit 11 performs the rotation action in the first direction, and the recognition block BK passes the sensing area SA, the edge counting value recorded in the edge counter is increased by 1. The edge counting value changes from 0 to 1. After receiving the edge detection signal indicating the edge counting value outputted by the edge detection unit 103, the processing unit 104 determines that the rotation unit 11 rotates one cycle in the first direction. Then, the processing unit 104 commands the edge detection unit 103 to reset the edge counting value recorded in the edge counter.
On the other hand, when the rotation unit 11 performs the rotation action in the second direction, and the recognition block BK passes the sensing area SA, the edge counting value recorded in the edge counter is decreased by 1. The edge counting value changes from 0 to −1. After receiving the edge detection signal indicating the edge counting value outputted by the edge detection unit 103, the processing unit 104 determines that the rotation unit 11 rotates one cycle in the second direction. Then, the processing unit 104 commands the edge detection unit 103 to reset the edge counting value recorded in the edge counter.
Briefly, whether the rotation unit 11 performs the rotation action in the first direction or the second direction, when the rotation unit 11 rotates one cycle, the edge counting value recorded in the edge counter is reset. By resetting the edge counting value every cycle, the optical navigation sensor 10 reduces the problem that the calculated number of the recognition blocks BK passing the sensing area SA does not match the actual number of the recognition blocks BK which pass the sensing area SA within the rotation action, because the cumulative calculation error caused by a deviation between the calculated edge counting value and the actual number is prevented from growing across cycles.
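The cycle-reset rule described above can be sketched as follows; the function name and the return convention are assumptions for illustration only:

```python
def update_and_reset(edge_count, n_blocks):
    """When the edge counting value reaches the specific value +N or -N
    (one full cycle of a rotation unit with N recognition blocks),
    report the cycle direction and reset the count to the initial 0.
    Returns (new_edge_count, completed_cycle_direction_or_None)."""
    if edge_count >= n_blocks:
        return 0, "first"   # one full cycle in the first direction
    if edge_count <= -n_blocks:
        return 0, "second"  # one full cycle in the second direction
    return edge_count, None  # no full cycle yet; keep counting
```

With N = 1 this reproduces the single-block example above: the counter is reset each time it reaches +1 or −1, so the deviation cannot accumulate over many cycles.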
For example, the above optical navigation sensor 10 can be used as a volume control knob of a sound system. The rotation unit 11 performing the rotation action with the first direction means bringing the volume up, and the rotation unit 11 performing the rotation action with the second direction means bringing the volume down. The recognition block BK disposed on the surface of the rotation unit 11 is associated with a volume variation of the sound system. A user can adjust a volume of the sound system by rotating the rotation unit 11. According to the number of the recognition block BK passing the sensing area SA of the optical navigation sensor 10, the processing unit 104 generates a volume control signal and outputs the volume control signal to a back-end circuit (such as the host 5 shown in
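A minimal sketch of how such a back-end circuit might map the edge counting value to a volume adjustment follows; the function name, the step size and the clamping range are all illustrative assumptions rather than part of the disclosure:

```python
def adjust_volume(volume, edge_delta, step=1, vol_min=0, vol_max=100):
    """Map a change in the edge counting value to a volume change:
    a positive delta (rotation in the first direction) brings the
    volume up, a negative delta (second direction) brings it down.
    The result is clamped to the assumed range [vol_min, vol_max]."""
    volume += edge_delta * step
    return max(vol_min, min(vol_max, volume))
```

For instance, three blocks passing in the first direction raise a volume of 50 to 53, while the clamp keeps the result inside the valid range near the endpoints.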
Please refer to
In this exemplary embodiment, the optical navigation sensor 80 further comprises an image processing unit 805. The image processing unit 805 is disposed between the pixel array 801, the navigation unit 802 and the edge detection unit 803. The pixel array 801 is coupled to the image processing unit 805. The image processing unit 805 is coupled to the navigation unit 802 and the edge detection unit 803.
The image processing unit 805 is configured for operatively receiving the images outputted by the pixel array 801, and performing image processing on the images to correspondingly generate second images. The image processing is, for example, image brightness compensation or image format conversion. The image processing unit 805 outputs the second images to the navigation unit 802 and the edge detection unit 803. Then, the navigation unit 802 and the edge detection unit 803 respectively generate a navigation signal and an edge detection signal in response to the second images.
Through performing image processing on the images outputted by the pixel array 801, the optical navigation sensor 80 reduces the time used in generating the navigation signal and the edge detection signal, because after image format conversion the image sizes of the second images are smaller than those of the images outputted by the pixel array 801. Furthermore, when the optical navigation sensor 80 calculates an amount of displacement which the rotation unit moves, the calculation accuracy of the optical navigation sensor 80 is increased, because after image brightness compensation the image resolution of the second images is higher than that of the images outputted by the pixel array 801.
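The two processing steps mentioned above can be sketched as follows, assuming an image is a plain list of pixel-value rows; the flat brightness offset and the keep-every-Nth-pixel downsampling are simplified stand-ins for the brightness compensation and format conversion named in the text:

```python
def preprocess(image, brightness_offset=0, downsample=1):
    """Illustrative image processing stage producing a 'second image':
    apply a flat brightness compensation (clamped to 8-bit range),
    then shrink the image by keeping every `downsample`-th pixel
    in each dimension."""
    compensated = [[min(255, max(0, p + brightness_offset)) for p in row]
                   for row in image]
    return [row[::downsample] for row in compensated[::downsample]]
```

With `downsample=2` a 2x2 image shrinks to a single pixel, illustrating why the second images can be processed faster by the navigation and edge detection units.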
Please refer to
In step S903, a navigation unit receives the images outputted by the pixel array. After receiving at least two images, the navigation unit determines a rotation direction of the rotation unit based upon a position variation of the recognition blocks in the images, and generates a navigation signal. The navigation signal indicates the rotation direction of the rotation unit. In step S904, an edge detection unit receives the images and the navigation signal, and generates an edge detection signal in response to the images and the navigation signal. The edge detection signal indicates a number of the recognition blocks which pass a sensing area of the pixel array within the rotation action. In step S905, a processing unit of the electronic device determines a rotation state of the rotation unit in response to the navigation signal and the edge detection signal. The rotation state comprises the rotation direction of the rotation unit and the number of the recognition blocks that passed the sensing area of the pixel array within the rotation action.
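Steps S903 through S905 can be sketched end-to-end under a strong simplifying assumption: that the recognition block's pixel column in each captured image has already been extracted (with None meaning the block is outside the sensing area). The function and its input format are illustrative, not the disclosed implementation:

```python
def determine_rotation_state(block_positions):
    """Sketch of steps S903-S905: from per-image block positions,
    derive the rotation direction (navigation signal) and the signed
    count of recognition blocks that passed the sensing area
    (edge detection signal). Returns (direction, edge_count)."""
    direction = None
    edge_count = 0
    prev = None
    for pos in block_positions:
        if prev is not None and pos is not None:
            if pos < prev:
                direction = "first"   # moving right to left
            elif pos > prev:
                direction = "second"  # moving left to right
        if prev is not None and pos is None:
            # the block just left the sensing area: count one pass,
            # signed according to the most recent direction
            edge_count += 1 if direction == "first" else -1
        prev = pos
    return direction, edge_count
```

A block drifting leftward across successive frames and then disappearing yields one pass in the first direction; the mirror-image sequence yields one pass in the second direction.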
Please refer to
In step S1003, the edge detection unit determines whether a recognition block passes a sensing area of the pixel array. If the edge detection unit detects that the recognition block passes the sensing area, then step S1004 is executed. Conversely, if the edge detection unit does not detect that the recognition block passes the sensing area, then step S1001 is executed, and the edge detection unit continues receiving the images and the navigation signal. In step S1004, the edge detection unit determines a rotation direction of a rotation unit in response to the navigation signal. If the navigation signal indicates that the rotation direction is a first direction, then step S1005 is executed. If the navigation signal indicates that the rotation direction is a second direction, then step S1006 is executed.
In step S1005, an edge counting value recorded in an edge counter of the edge detection unit increases. In step S1006, the edge counting value recorded in the edge counter of the edge detection unit decreases. In step S1007, the edge detection unit generates an edge detection signal according to the edge counting value recorded in the edge counter.
In summary, compared to a conventional optical navigation sensor, the optical navigation sensor, the electronic device and the operation method provided by the present disclosure utilize the navigation unit to determine an amount of displacement of the rotation unit, and utilize the edge detection unit to detect the recognition block disposed on the surface of the rotation unit. By the navigation unit and the edge detection unit, the optical navigation sensor provided by the present disclosure can calculate the amount of displacement of the rotation unit more accurately, such that a back-end circuit can perform a corresponding action according to the calculated amount of displacement of the rotation unit.
The above-mentioned descriptions represent merely the exemplary embodiments of the present disclosure, without any intention to limit the scope of the present disclosure thereto. Various equivalent changes, alterations or modifications based on the claims of the present disclosure are all consequently viewed as being embraced by the scope of the present disclosure.