1. Field of the Invention
The present invention relates to input devices having a fingerprint reading sensor, and more particularly to an input device having a pointing device function and to a pointer control method.
2. Description of the Related Art
Conventionally, devices provided with various functions including fingerprint recognition have been proposed. For example, Japanese Laid-Open Patent Application No. 11-283026 discloses a touch pad having a fingerprint reading function and an information processing apparatus. In this application, a description is given of a case where a sensor for detecting a fingerprint and a position sensor for detecting the position which a finger touches, which conventionally are provided in separate units, are provided in a single unit, and position data in the vicinity of the center of the finger touching the sensor is sent as coordinate information to a computer. Also, Japanese Laid-Open Patent Application No. 4-158434 discloses a pointing device of a display apparatus. This application discloses the invention of moving a cursor on a display based on the moving direction and moving distance of a finger that are obtained from fingerprint patterns detected at regular time intervals. In addition, Japanese Laid-Open Patent Application No. 10-275233 discloses an information processing system, a pointing device and an information processing apparatus. This application gives a description of the invention of determining the moving position of a pointer based on the difference between a fingerprint image and a verified image detected by using Fourier transformation.
As described above, in the conventional input devices, the pointer on the display is controlled by detecting the movement of the finger.
However, the method of detecting the movement of the finger has a disadvantage in operability, since the finger must actually be moved. At the same time, a sensing part large enough to track the movement is required, which makes the device larger and results in an increase in cost.
It is a general object of the present invention to provide an improved and useful input device and a pointer control method in which the above-mentioned problems are eliminated.
It is another and more specific object of the present invention to provide an input device and a control method that can control a pointer without actually moving a finger and detect a click operation.
In order to achieve the above-mentioned objects, according to one aspect of the present invention, there is provided an input device that transmits, to a host device having a display displaying a pointer, information relating to the pointer and includes a sensor detecting contact information including: an area extracting part extracting a contact area of the sensor and a finger based on the contact information detected by the sensor; and a control part controlling the pointer based on the contact area extracted by the area extracting part.
According to the present invention, there is provided an input device in which the area extracting part extracts the contact area between the finger and the sensor, and the pointer is controlled by using the extracted contact area.
Also, the input device may include a first control part that detects the position of the barycenter of the contact area or the position of the center of the contact area, and controls the pointer by using the detected position.
In addition, the input device may include a click detecting part that detects a click based on a detected time by using the condition that the barycenter or center cannot be detected when the finger is not touching the sensor. It should be noted that the click refers to an operation of pressing a sensor surface (sensor part) once.
Further, the input device may include a second control part that detects, from the contact area, the area (size) thereof, the length of the outline thereof, and the length of the sides of a rectangle circumscribing the contact area. By the second control part, it is also possible to determine the moving speed of the pointer based on the detected amount (the area, the length of the outline, and the length of the sides). Moreover, it is also possible to detect the click operation according to the detected amount and further to detect the movement of the finger in a direction almost perpendicular to the contact area.
The information (contact area information) of the contact area used as mentioned above may be reduced or compressed with respect to predetermined information. Thus, it is possible to improve the processing speed.
In addition to the above-described functions as a pointing device, as a fingerprint identification function, the input device according to the present invention may include fingerprint dictionary information that records fingerprint information beforehand. Thus, it is possible to identify the fingerprint by using the fingerprint dictionary information. Further, it is possible to record a plurality of fingerprints in the fingerprint dictionary information, for example, the first and second fingers of one hand for each user. By recording the plurality of fingerprints, it is possible to automatically switch several operation modes of the input device.
According to the present invention, it is possible to provide an input device and a pointer control method that can control the pointer without actually moving the finger and detect the click operation.
Other objects, features and advantages of the present invention will become more apparent from the following detailed description when read in conjunction with the following drawings.
A description will be given of embodiments of the present invention, by referring to the drawings.
As shown in
Next, a description will be given of the function of the input device 10, with reference to FIG. 5.
Next, a description will be given of the above-mentioned sensor part 12 and A/D converter 14. First, the sensor part 12 scans the sensor surface and detects the contact information of the finger as analog data. The A/D converter 14 converts the analog data detected by the sensor part 12 into digital data. The fingerprint information on which the digital conversion is performed is output as contact information using boxes obtained by dividing the part where the finger touches, as shown in
A description will be given of an example of the contact information, by referring to FIG. 7. When the finger touches the sensor part 12, as shown in
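The classification of boxes into the contact area and the non-contact area described above can be sketched as follows. This is a minimal illustration; the grid size, the shading values and the threshold value C=128 are hypothetical.

```python
# Classify each box of the sensor grid as contact (1) or non-contact (0)
# by comparing its shading value Pij with the threshold value C.
def classify_boxes(shading, c):
    """shading: 2-D list of shading values Pij; returns a binary contact map."""
    return [[1 if pij >= c else 0 for pij in row] for row in shading]

# Hypothetical shading values output by the A/D converter.
shading = [
    [0, 10, 12, 0],
    [8, 200, 180, 9],
    [5, 190, 210, 7],
]
contact = classify_boxes(shading, c=128)
# The boxes marked 1 form the contact area (fingerprint pattern).
```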
A description will be given of the position/direction detecting part 28. The position/direction detecting part 28 has the function of detecting, from the contact area (fingerprint pattern) obtained from the contact information, the position/direction moving amount that is the absolute coordinate value on the display unit 50. There are two kinds of patterns of the position/direction detecting part 28 depending on the combination of data used for detecting the absolute coordinate value. In the following, the two patterns will be explained in order.
First, a description will be given of a barycenter pattern using the barycenter of the contact area (fingerprint pattern), with reference to FIG. 8.
Next, a description will be given of the processes of the area detecting part 34 and the barycenter position detecting part 30, in the order of the detection of the area and the determination of the barycenter. First, in above-mentioned
In Formula 1, dij is the identification result indicating whether the box is in the contact area or the non-contact area. At the same time, paying attention to the fact that the area of each of the boxes is 1, the total area of the boxes determined to be included in the contact area, that is, the area (size) S of the contact area, is as follows.
As described above, the area detecting part 34 obtains the area S of the contact area. Then, projection values Txi and Tyj are obtained for the X-axis and Y-axis, respectively, so as to obtain the barycenter by using the area S. The definitional formulas of the projection values Txi and Tyj are represented by Formulas 3 and 4, respectively.
Since the right sides of these equations are the sums over the boxes determined to be the contact area, Formula 3 represents the sum of the areas of the boxes of which the X coordinate is i, and Formula 4 represents the sum of the areas of the boxes of which the Y coordinate is j. Additionally, in Formulas 3 and 4, there are m projection values Txi and n projection values Tyj. The box corresponding to the barycenter position is defined by Formula 5 from the obtained projection values Txi and Tyj.
Here, αi and βj are coefficient sequences defined, for example, as αi=i and βj=j. In addition, the above X coordinate and Y coordinate are weighted averages obtained by dividing the sum of the αi times the projection value Txi and the sum of the βj times the projection value Tyj, respectively, by the area S.
Referring to the flow chart of
A description will be given of the flow chart. In the first step S101, the value of the area S, the projection value X, the projection value Y and a loop counter j are initialized. Next, in step S102, it is determined whether or not the loop counter j exceeds n. The loop counter j is the counter counting the number of the boxes arranged in the Y-axis direction. Thus, when j exceeds n, which is the number of the boxes arranged in the Y-axis direction, the process of obtaining Tyj and S is finished, and the process proceeds to the next block, which will be described later, where the projection value Txi is obtained. When j is equal to or less than n in step S102 (NO in step S102), the process proceeds to step S103 since the process of obtaining Tyj and S is continued. In step S103, initialization of a loop counter i is performed. The loop counter i is a counter counting the number of the boxes arranged in the X-axis direction. In step S104, as is the case with j, whether or not i exceeds m is determined. In a case where i exceeds m, which is the number of the boxes arranged in the X-axis direction (YES in step S104), the process relating to the boxes of which the Y coordinate is j ends. Thus, the process proceeds to step S108 so as to perform the process of the “j+1”th boxes. When i is equal to or less than m (NO in step S104), in step S105, whether or not the shading value Pij is smaller than the threshold value C is determined. When Pij is smaller than C (YES in step S105), since the box (i, j) is not in the contact area (fingerprint pattern), no process is performed, and the process proceeds to step S109 so as to perform the process of the “i+1”th box. When Pij is equal to or larger than C (NO in step S105), the box (i, j) is in the contact area. Accordingly, the processes of obtaining the area S and the projection value Tyj, which are the objects of this process block, are performed in steps S106 and S107, respectively.
Step S106 is the process of obtaining the area S. In step S106, the area 1 of the box is added to the area S. In addition, step S107 is the process of obtaining the projection value Tyj. Thus, in step S107, Pij is added to Tyj (refer to Formula 4). When the above-described process ends, the process proceeds to step S109 so as to perform the process of “i+1”th box. When the area S and the projection value Y are obtained in this manner, the process branches off to step S110 at step S102, and the process block of obtaining the projection value X is performed.
In the process block of obtaining the projection value X, as shown in Formula 3, the sum of Pij relating to j is obtained by fixing i. When Txi is obtained, the sum is obtained again after incrementing i. This process is repeated until the condition i>m is satisfied. Therefore, first, in step S110, the loop counter i is initialized again. Next, in step S111, whether or not i exceeds m is determined. The loop counter i is the counter for counting the number of the boxes arranged in the X-axis direction. Thus, when i exceeds m, which is the number of the boxes arranged in the X-axis direction (YES in step S111), the projection value X has been obtained. In the first process block, the projection value Y and the area S have already been obtained. Accordingly, the barycenter position is obtained in step S118 (refer to Formula 5). In step S112, initialization of the loop counter j is performed. The loop counter j is a counter counting the number of the boxes arranged in the Y-axis direction. Similar to i, it is determined whether or not j exceeds n in step S113. Then, as in the case with i, when j exceeds n, which is the number of the boxes arranged in the Y-axis direction (YES in step S113), the process of the “i”th boxes in the Y-axis direction is finished. Thus, the process proceeds to step S117 so as to perform the process of the “i+1”th boxes. When j is equal to or less than n in step S113 (NO in step S113), in the next step S114, the above-described comparison between the shading value Pij and the threshold value C is performed. When Pij is smaller than C (YES in step S114), since the box (i, j) is in the non-contact area, no process is performed and the process proceeds to step S116 for performing the process of the “j+1”th boxes. When Pij is equal to or larger than C (NO in step S114), since the box (i, j) is in the contact area, Pij is added to Txi in step S115 so as to obtain the projection value Txi that is the object of this process block (refer to Formula 3).
When the projection value X is obtained in this manner, the process branches off to step S118 at step S111, the barycenter position is calculated in step S118, and thus the barycenter is obtained.
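The whole barycenter computation can be sketched as follows, assuming a 2-D list of shading values. The sketch follows Formulas 1 through 5, accumulating the unit area of each contact box into the projections (consistently with Formula 2, each contact box contributes area 1) and taking the coefficients as αi=i and βj=j.

```python
def barycenter(shading, c):
    """Return (S, X, Y): the area of the contact area and the barycenter.

    A box (i, j) belongs to the contact area when Pij >= C; each box has
    area 1, the projections Txi and Tyj collect the areas per column and
    per row, and the barycenter is the weighted average divided by S."""
    n = len(shading)      # number of boxes in the Y-axis direction
    m = len(shading[0])   # number of boxes in the X-axis direction
    s = 0
    tx, ty = [0] * m, [0] * n
    for j in range(n):
        for i in range(m):
            if shading[j][i] >= c:
                s += 1        # Formula 2: each contact box contributes area 1
                tx[i] += 1    # projection onto the X-axis (Formula 3)
                ty[j] += 1    # projection onto the Y-axis (Formula 4)
    if s == 0:
        return None           # no contact area: the finger is not touching
    x = sum(i * tx[i] for i in range(m)) / s   # weighted averages (Formula 5)
    y = sum(j * ty[j] for j in range(n)) / s
    return s, x, y
```

Returning None when S is 0 mirrors the flow chart's branch for the case where no contact area exists; that same condition is what the click detecting part exploits later.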
As mentioned above, in the barycenter pattern, the coordinate position is detected by obtaining the barycenter position of the contact area. The thus detected coordinate position is, as shown in
Next, a description will be given of a center pattern in which the coordinate position is detected by using a center position instead of the barycenter position. As shown in
First, prior to a description of a flow chart showing the process in which the center position detecting part 58 detects the center, the “center” in this embodiment is defined.
To begin with, a collection of the boxes determined as the contact area is represented by M.
M={(i,j)|Pij≧C} (Formula 6)
The maximum value of i in the boxes belonging to the collection M is represented by xmax.
xmax=max{i|(i,j)∈M} (Formula 7)
The minimum value of i in the boxes belonging to the collection M is represented by xmin.
xmin=min{i|(i,j)∈M} (Formula 8)
The maximum value of j in the boxes belonging to the collection M is represented by ymax.
ymax=max{j|(i,j)∈M} (Formula 9)
The minimum value of j in the boxes belonging to the collection M is represented by ymin.
ymin=min{j|(i,j)∈M} (Formula 10)
Thus obtained xmax, xmin, ymax and ymin correspond to the respective values shown in FIG. 12. From these xmax, xmin, ymax and ymin, the center is defined as follows.
A description will be given of the above-described process of obtaining the center, with reference to the flow chart in FIG. 14. First, xmin (a starting position in the X-axis direction) is obtained in the first step S201. Step S201 is a function of the C language, for example, having xmin as a returned value. In a case of the non-contact area where xmin cannot be calculated, a process having “−1” as the returned value is performed. Then, step S202 determines whether or not the contact area exists. When the result in step S201 is “−1”, the result indicates that all boxes are in the non-contact area. Accordingly, in step S202, it is determined that there is no contact area (YES in step S202), and the process ends. When xmin is calculated in step S201, since it is determined that the contact area exists (NO in step S202), a process of calculating xmax (an ending position in the X-axis direction) is performed in step S203, and the process relating to the X-axis ends. Similarly, regarding the Y-axis, ymin (a starting position in the Y-axis direction) and ymax (an ending position in the Y-axis direction) are calculated in steps S204 and S205, respectively. When xmax, xmin, ymax and ymin are calculated in this manner, by obtaining the midpoint (center position X) of xmax and xmin in step S206 and obtaining the midpoint (center position Y) of ymax and ymin in step S207, it is possible to obtain the center of Formula 11.
Among the above-described processes, a description will be given of examples of the processes in steps S201 and S203, by referring to
Next, a description will be given of the process of step S203 in which xmax is obtained, by referring to the flow chart of FIG. 16. In step S401, the loop counter i is initialized to m. In the next step S402, whether or not i is less than 0 is determined. The loop counter i is the counter for counting down from m to 0. Thus, when i becomes less than 0 (YES in step S402), it is considered in step S406 that no contact area exists, and the process ends. When i is equal to or larger than 0 (NO in step S402), whether or not Txi is 0 is determined in step S403. In this determination process, when the projection value Txi is 0 (YES in step S403), the contact area is not detected in the “i”th boxes. Accordingly, i is decremented in step S404, and the process of step S402 is repeated. In a case where the projection value Txi is not 0 (NO in step S403), the contact area is detected for the first time in the “i”th boxes. Thus, in step S405, i is substituted in xmax, and the process ends.
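The center pattern can be sketched compactly as follows. Instead of scanning the projection values as the flow charts do, this sketch builds the collection M of Formula 6 directly from the shading values, which is an equivalent shortcut.

```python
def center(shading, c):
    """Return the center (X, Y) of the contact area, or None when the
    finger is not touching the sensor.

    M is the collection of boxes with Pij >= C (Formula 6); the center is
    the midpoint of xmin/xmax and of ymin/ymax (Formulas 7 through 11)."""
    m_boxes = [(i, j)
               for j, row in enumerate(shading)
               for i, pij in enumerate(row) if pij >= c]
    if not m_boxes:
        return None  # corresponds to the "-1" / no-contact-area branch
    xs = [i for i, _ in m_boxes]
    ys = [j for _, j in m_boxes]
    return (min(xs) + max(xs)) / 2, (min(ys) + max(ys)) / 2
```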
As described above, in this pattern (the center pattern), the coordinate position is detected by obtaining the center position. The thus detected coordinate position is, as shown in
Next, a description will be given of a determination method of a click operation, that is, determining whether a click or a double-click is performed, by using the detection of the barycenter or the center, with reference to FIG. 17.
In the graph of
First, a description will be given of the determination of a click. The above-described “click-ON time” is a threshold that is compared with the time from when the barycenter is detected until the barycenter is no longer detected. If this time is shorter than the “click-ON time”, it is determined that a click is made. Therefore, where Ton1 is the time when the barycenter is detected and Toff1 is the time when the barycenter is no longer detected, when the inequality 0<Toff1−Ton1<“click-ON time” is satisfied, it is determined that a click is made.
Next, a description will be given of the determination of a double-click. A double-click means that two clicks are successively detected. When the interval between the two clicks is too long, the two clicks are determined not to be a double-click. The “click-OFF time” is the time limit of the interval between the two clicks. Thus, the determination of a double-click is made by three determinations: two click determinations and a determination of the interval between the clicks. The determination inequalities are:
0<Toff1−Ton1<click-ON time (the first click determination)
0<Ton2−Toff1<click-OFF time (interval determination of two clicks)
0<Toff2−Ton2<click-ON time (the second click determination)
where Ton2 is the time when the barycenter is detected, and Toff2 is the time when the barycenter is not detected anymore. When the three inequalities are all satisfied, the two clicks are determined to be a double click.
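The three inequalities can be applied directly to the detection times, as sketched below. The concrete values of the “click-ON time” and the “click-OFF time” are hypothetical, expressed in timer-counter ticks.

```python
CLICK_ON_TIME = 5    # hypothetical "click-ON time" threshold
CLICK_OFF_TIME = 8   # hypothetical "click-OFF time" limit between clicks

def classify_click(ton1, toff1, ton2=None, toff2=None):
    """Classify barycenter detection/loss times as a click, a double-click
    or neither, using the three determination inequalities."""
    if not 0 < toff1 - ton1 < CLICK_ON_TIME:
        return "none"                 # the first press was held too long
    if ton2 is None or toff2 is None:
        return "click"                # only one press was observed
    if (0 < ton2 - toff1 < CLICK_OFF_TIME        # interval determination
            and 0 < toff2 - ton2 < CLICK_ON_TIME):  # second click
        return "double-click"
    return "click"
```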
In the detection of a click, it is also possible to detect the click by detecting the contact area without performing the detection (calculation) of the barycenter or the center. In this case, “with barycenter” and “without barycenter” in
Next, in addition to the above-described click determination by the detection of barycenter, a description will be given of a method of click determination by the area, with reference to the flow chart extending across
Next, a description will be given of the flow chart. First, in step S501, Ton1, Toff1, Ton2, Toff2 and T are initialized to 0. Next, in step S502, the image of the contact area is input, and whether or not S is equal to or larger than Sc is determined in step S503. This determination determines whether or not the finger is touching the sensor part 12. When S is smaller than Sc (NO in step S503), it is determined that the finger is not touching the sensor part 12. Thus, in step S504, the time counter T is incremented, and the process of step S502 is repeated. When S is equal to or larger than Sc (YES in step S503), it is determined that the finger is touching the sensor part 12, and the value of the time counter T is substituted in Ton1 in step S505. Next, in step S506, the image of the contact area is input again, and in the next step S507, whether or not S is smaller than Sc is determined. This determination determines whether or not the finger is touching the sensor part 12. When S is equal to or larger than Sc (NO in step S507), it is determined that the finger is touching the sensor part 12. Accordingly, in step S508, the time counter T is incremented, and the process of step S506 is repeated. When S is smaller than Sc (YES in step S507), it is determined that the finger is not touching the sensor part 12, and the value of the counter T is substituted in Toff1 in step S509. At this moment, since Ton1 and Toff1 are obtained, the first click determination is performed in step S510. Similar to the click determination according to the barycenter, the click determination of step S510 determines whether or not “Toff1−Ton1<click-ON time” is satisfied. When it is not satisfied (NO in step S510), neither a click nor a double-click is made. Thus, in step S522, the normal position/direction detecting process is performed. When “Toff1−Ton1<click-ON time” is satisfied (YES in step S510), it is confirmed that a click is made.
Then, in step S511, the image of the contact area is input again, the area S is obtained, and whether or not the area S is equal to or larger than Sc is determined in step S512. When S is smaller than Sc (NO in step S512), since the finger is not touching the sensor part 12, the time counter T is incremented in step S513, and the process of step S511 is repeated. When S is equal to or larger than Sc (YES in step S512), the value of the counter T is substituted in Ton2 in step S514. Since Ton2 is obtained here, whether or not “Ton2−Toff1<click-OFF time” is satisfied is determined in step S515 (the interval determination of two clicks). When “Ton2−Toff1<click-OFF time” is not satisfied (NO in step S515), since the interval is long, it is determined that the click is a single click in step S516, and the process proceeds to step S501 so as to prepare for the next process. When “Ton2−Toff1<click-OFF time” is satisfied (YES in step S515), the image of the contact area is input again in step S517, the area S is obtained, and whether or not the area S is smaller than Sc is determined in step S518. When S is smaller than Sc (YES in step S518), the finger is not touching the sensor part 12. Thus, in step S519, the value of the time counter T is substituted in Toff2. When S is not smaller than Sc (NO in step S518), the time counter T is incremented in step S523, and the process of step S517 is repeated. At this moment, since Ton2 and Toff2 are obtained, whether or not a click is made is determined in step S520 (the second click determination). Similar to the click determination according to the barycenter, the click determination of step S520 determines whether or not “Toff2−Ton2<click-ON time” is satisfied. When it is not satisfied (NO in step S520), since neither a click nor a double-click is made, the normal position/direction detecting process is performed in step S522.
When “Toff2−Ton2<click-ON time” is satisfied (YES in step S520), it is determined in step S521 that a double-click is made, and the process returns to step S501 so as to prepare for the next process.
As mentioned above, unlike the determination according to the barycenter or the center, the determination according to the area makes it possible to determine a click and a double-click even though the finger is not completely taken off the sensor part 12.
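In the area-based determination, the touch condition S ≥ Sc replaces the presence of the barycenter. The extraction of the Ton/Toff transitions from a sequence of per-frame area values can be sketched as follows; the threshold Sc and the sample sequence are hypothetical.

```python
SC = 50  # hypothetical area threshold Sc separating touch from no touch

def touch_transitions(areas):
    """Return (T, event) pairs: 'on' when the area S rises to Sc or more,
    'off' when it falls below Sc again, with T the frame counter."""
    events, touching = [], False
    for t, s in enumerate(areas):
        if not touching and s >= SC:
            events.append((t, "on"))
            touching = True
        elif touching and s < SC:
            events.append((t, "off"))
            touching = False
    return events
```

The resulting Ton1, Toff1, Ton2 and Toff2 values can then be tested against the same three inequalities as in the barycenter-based determination; since lifting the finger only partially already drops S below Sc, the click is recognized without the finger being completely taken off the sensor part 12.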
Here, a description will be given of a determination method of the moving speed of a mouse cursor. As mentioned above, the determination of the moving speed of the mouse cursor is performed by the moving speed determining part 36 according to the area obtained by the area detecting part 34 (refer to FIG. 8). In the following, the determination method of the moving speed is explained.
To begin with, a description will be given of a general overview of the determination method. In the determination method, since the speed is determined according to the area, the area at the lowest speed and the area at the highest speed are set beforehand. Then, based on the set two areas, the speed at the detected area is determined.
A detailed description will be given of the process of the determination method.
First, in step S1401, the MCU 18 requests, from the area detecting part 34, the value of the fingerprint area at SP1, which is the lowest speed, when the finger number is n. It should be noted that the finger number n refers to the number assigned to each finger, such as 1 through 5 for the little finger to the thumb of the right hand.
Next, in step S1402, the MCU 18 determines whether or not to set the area value. When the MCU 18 determines to set the area value (YES in step S1402), the process proceeds to step S1404. When the area value is not set (NO in step S1402), in step S1403, it is determined whether or not to retry requesting the area value. Then, when the retry is not performed (NO in step S1403), the MCU 18 ends the process. When the retry is performed (YES in step S1403), the process returns to step S1401.
In step S1402, when the MCU 18 determines to set the area value of the fingerprint (hereinafter referred to as the “fingerprint area value”), in step S1404, the MCU 18 sets a fingerprint area value Fn1 at SP1 to the fingerprint dictionary data 24.
Here, a description will be given of the fingerprint dictionary data 24. As shown in
The description of the flow chart will now be continued. In step S1405, the MCU 18 requests, from the area detecting part 34, the fingerprint area value Fn2 at SP2, which is the highest speed when the finger number is n.
Then, in step S1406, the MCU 18 determines whether or not to set the area value. When the MCU 18 determines to set the area value (YES in step S1406), in step S1407, the MCU 18 sets the fingerprint area value Fn2 at SP2 to the fingerprint dictionary data 24 and ends the process.
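One possible in-memory layout of the area values held in the fingerprint dictionary data 24 is sketched below. The dictionary structure, the helper name and the concrete area values are hypothetical.

```python
# Hypothetical representation of the fingerprint dictionary data 24: for
# each finger number n it records the fingerprint area value Fn1 at the
# lowest speed SP1 (step S1404) and Fn2 at the highest speed SP2 (S1407).
fingerprint_dictionary = {}

def set_area_values(finger_number, fn1, fn2):
    fingerprint_dictionary[finger_number] = {"Fn1": fn1, "Fn2": fn2}

set_area_values(1, 40, 120)  # e.g. the first finger (hypothetical values)
```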
The moving speed is determined by using thus set fingerprint area values at the lowest and highest speeds. A description will be given of this determination method, with reference to FIG. 42.
In the graph shown in
The moving speed is determined according to the flow chart of
First, in order to detect a current fingerprint area value, in step S1501, the MCU 18 retrieves (calls) the fingerprint area value having the finger number n from the area detecting part 34. Then, in step S1502, the MCU 18 retrieves the lowest speed SP1 and the highest speed SP2 that are set earlier. In addition, in step S1503, the MCU 18 also retrieves the fingerprint area value Fn1 at the lowest speed SP1 and the fingerprint area value Fn2 at the highest speed SP2.
Next, in step S1504, the MCU 18 compares the current fingerprint area value and the fingerprint area value Fn1 at the lowest speed SP1. In a case where the current fingerprint area value is smaller than the fingerprint area value Fn1 at the lowest speed SP1 (YES in step S1504), the case corresponds to the segment L2 of FIG. 42. Accordingly, in step S1505, the moving speed is set to SP1.
In step S1506, the MCU 18 compares the current fingerprint area value with the fingerprint area value at the highest speed SP2. In a case where the current fingerprint area value is larger than the fingerprint area value Fn2 at the highest speed SP2 (YES in step S1506), the case corresponds to the segment L3. Thus, in step S1507, the moving speed is set to SP2.
Step S1508 is a process corresponding to a case where the current fingerprint area value is in a section between Fn1 and Fn2. In addition, step S1508 is a process of determining the moving speed corresponding to the segment L1 of FIG. 42.
As shown in step S1508, the moving speed in this case is as follows.
(SP2−SP1)×(fingerprint area value)/(Fn2−Fn1)+(SP1×Fn2−SP2×Fn1)/(Fn2−Fn1)
The equation of the line connecting (Fn1, SP1) and (Fn2, SP2) is:
Y−SP1=(SP2−SP1)/(Fn2−Fn1)×(X−Fn1).
Accordingly, by substituting the fingerprint area value for X in the above equation and rearranging it, the equation shown in step S1508 is obtained. Hence, it is possible to determine the moving speed by using the fingerprint area value. Additionally, since the fingerprint dictionary data 24 records a speed parameter for each fingerprint, it is possible to choose the speed by the finger, such as moving the mouse cursor fast in the case of the first finger and moving it slowly in the case of the second finger.
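Steps S1504 through S1508 amount to a clamped linear interpolation, which can be sketched as follows.

```python
def moving_speed(area, fn1, fn2, sp1, sp2):
    """Determine the moving speed from the current fingerprint area value.

    Below Fn1 the speed is clamped to SP1 (segment L2 of the graph), above
    Fn2 it is clamped to SP2 (segment L3); between them the speed lies on
    the line through (Fn1, SP1) and (Fn2, SP2), as in step S1508."""
    if area < fn1:
        return sp1
    if area > fn2:
        return sp2
    return ((sp2 - sp1) * area / (fn2 - fn1)
            + (sp1 * fn2 - sp2 * fn1) / (fn2 - fn1))
```

For example, with the hypothetical settings Fn1=40, Fn2=120, SP1=1 and SP2=9, an area value halfway between Fn1 and Fn2 yields the halfway speed 5.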
Furthermore, the moving speed may be determined according to the graph shown in
The formula for determining the moving speed in the segment L1 of
(SP3−SP2)×(fingerprint area value)/(Fn3−Fn2)+(SP2×Fn3−SP3×Fn2)/(Fn3−Fn2)
Besides the determining method of the moving speed using the plurality of segments as mentioned above, the moving speed may be determined by using a quadratic curve or an exponential curve, for example.
Next, a description will be given of a method of determining the moving speed by detecting the outline of the fingerprint or the XY width.
Conventionally, when obtaining the outline of a banded image such as a fingerprint image, a differential filter such as a 3×3 pixel matrix has been used after filling the spaces between the bands. Thus, the process is complex. To eliminate this problem, the present invention uses a method of extracting the outline of the fingerprint relatively easily. In the position/direction detecting part 28 of
Next, a description will be given of the outline, with reference to FIG. 20. In
Next, the lengths of the segments are obtained. For that purpose, by taking the boxes A, B, C and D as normal coordinates, the sum of the lengths of a segment AB, a segment BC, a segment CD and a segment DA is made the length of the outline. A description will be given of a specific method of obtaining the sum of the lengths, by referring to flow charts. First, a description will be given of a method of obtaining the coordinates of the box A, by referring to the flow chart in FIG. 22. In step S601, the loop counter j is initialized with 0. In step S602, whether or not j is larger than n is determined. When j exceeds n (YES in step S602), it is determined that there is no contact area in step S608, and the process ends. When j is equal to or less than n (NO in step S602), the loop counter i is initialized with 0 in step S603. In step S604, whether or not i is larger than m is determined. When i is larger than m (YES in step S604), j is incremented in step S607, and the process of step S602 is repeated. When i is equal to or smaller than m (NO in step S604), in step S605, whether or not the shading value Pij is smaller than the threshold value C is determined. When Pij is smaller than C (YES in step S605), since the box is in the non-contact area, i is incremented in step S606, and the process of step S604 is repeated. When Pij is equal to or larger than C (NO in step S605), it is regarded that the contact area is found for the first time. Thus, i and j at the moment are substituted in xc and ymin, respectively, and thus the coordinates of the box A are obtained.
Next, a description will be given of a method of obtaining the coordinates of the box B, with reference to the flow chart in FIG. 23. First, in step S701, the loop counter i is initialized with 0. In step S702, whether or not i is larger than m is determined. When i exceeds m (YES in step S702), it is regarded that there is no contact area in step S708, and the process ends. When i is equal to or less than m (NO in step S702), in step S703, the loop counter j is initialized with 0. In step S704, whether or not j is larger than n is determined. When j is larger than n (YES in step S704), i is incremented in step S707, and the process of step S702 is repeated. When j is equal to or less than n (NO in step S704), in step S705, whether or not the shading value Pij is smaller than the threshold value C is determined. When Pij is smaller than C (YES in step S705), since the box is in the non-contact area, j is incremented in step S706, and the process of step S704 is repeated. When Pij is equal to or larger than C (NO in step S705), it is considered that the contact area is found for the first time. Accordingly, i and j at the time are substituted in xmin and ya, respectively, and thus the coordinates of the box B are obtained.
Next, a description will be given of a method of obtaining the coordinates of the box C, by referring to the flow chart in FIG. 24. First, in step S801, the loop counter j is initialized with n. In step S802, whether or not j is smaller than 0 is determined. When j is smaller than 0 (YES in step S802), it is considered in step S808 that there is no contact area, and the process ends. When j is equal to or larger than 0 (NO in step S802), the loop counter i is initialized with 0 in step S803. In step S804, whether or not i is larger than m is determined. When i is larger than m (YES in step S804), j is decremented in step S807, and the process of step S802 is performed. When i is equal to or smaller than m (NO in step S804), whether or not the shading value Pij is smaller than the threshold value C is determined in step S805. When Pij is smaller than C (YES in step S805), since the box is in the non-contact area, i is incremented in step S806, and the process of step S804 is repeated. When Pij is equal to or larger than C (NO in step S805), it is considered that the contact area is found for the first time. Thus, i and j at that time are substituted into xd and ymax, respectively, and the coordinates of the box C are obtained.
Next, a description will be given of a method of obtaining the coordinates of the box D, with reference to the flow chart in FIG. 25. First, in step S901, the loop counter i is initialized with m. In step S902, whether or not i is smaller than 0 is determined. When i is smaller than 0 (YES in step S902), it is considered in step S908 that there is no contact area, and the process ends. When i is equal to or larger than 0 (NO in step S902), the loop counter j is initialized with 0 in step S903. In step S904, whether or not j is larger than n is determined. When j is larger than n (YES in step S904), i is decremented in step S907, and the process of step S902 is performed. When j is equal to or smaller than n (NO in step S904), whether or not the shading value Pij is smaller than the threshold value C is determined in step S905. When Pij is smaller than C (YES in step S905), the box is in the non-contact area. Thus, j is incremented in step S906, and the process of step S904 is repeated. When Pij is equal to or larger than C (NO in step S905), it is considered that the contact area is found for the first time, and i and j at that time are substituted into xmax and yb, respectively, and thus the coordinates of the box D are obtained.
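For reference, the four scans of FIGS. 22 through 25 can be sketched as follows. This is a minimal Python sketch, not part of the flow charts: the array P indexed as P[j][i], the function name and the threshold parameter name `thresh` (used instead of C to avoid a clash with the box name) are assumptions.

```python
def find_boxes(P, m, n, thresh):
    """Coordinates of the boxes A, B, C and D; None when no contact area exists.
    A box (i, j) belongs to the contact area when its shading value
    P[j][i] is equal to or larger than thresh."""
    box_a = box_b = box_c = box_d = None
    for j in range(n + 1):                 # FIG. 22: scan rows top to bottom
        for i in range(m + 1):
            if P[j][i] >= thresh:          # first contact box -> (xc, ymin)
                box_a = (i, j)
                break
        if box_a is not None:
            break
    for i in range(m + 1):                 # FIG. 23: scan columns left to right
        for j in range(n + 1):
            if P[j][i] >= thresh:          # first contact box -> (xmin, ya)
                box_b = (i, j)
                break
        if box_b is not None:
            break
    for j in range(n, -1, -1):             # FIG. 24: scan rows bottom to top
        for i in range(m + 1):
            if P[j][i] >= thresh:          # first contact box -> (xd, ymax)
                box_c = (i, j)
                break
        if box_c is not None:
            break
    for i in range(m, -1, -1):             # FIG. 25: scan columns right to left
        for j in range(n + 1):
            if P[j][i] >= thresh:          # first contact box -> (xmax, yb)
                box_d = (i, j)
                break
        if box_d is not None:
            break
    return box_a, box_b, box_c, box_d
```

Each scan stops at the first contact box it meets, so the four boxes together delimit the topmost, leftmost, bottommost and rightmost extent of the fingerprint pattern.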
Using the above-described processes, a description will be given of a process of obtaining the length of the outline, with reference to the flow chart of FIG. 26. Steps S1001 through S1004 are the above-mentioned processes of obtaining the boxes A, B, C and D: step S1001 corresponds to the flow chart of FIG. 23, step S1002 corresponds to the flow chart of FIG. 25, step S1003 corresponds to the flow chart of FIG. 22, and step S1004 corresponds to the flow chart of FIG. 24. In step S1005, the length of the outline is obtained from the thus obtained boxes A, B, C and D, where L1=AB, L2=DA, L3=CD and L4=BC. Any measuring method that satisfies the definition of a distance in a metric space, such as the distance defined in the Euclidean space, may be used. After obtaining L1, L2, L3 and L4 in step S1005 by using such a measuring method, the sum (the length of the outline) L of L1, L2, L3 and L4 is calculated in step S1006, and thus the length of the outline is obtained. In this manner, it is possible to obtain the length of the outline more simply than by using a differential filter. Additionally, the length L is reflected in the moving speed of the mouse cursor.
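The calculation of steps S1005 and S1006 can be sketched as follows, here assuming the Euclidean distance; as noted above, any metric-space distance would serve, and the function name is illustrative.

```python
import math

def outline_length(A, B, C, D):
    """Sum of the segment lengths AB, BC, CD and DA (the outline length L)."""
    def dist(p, q):
        # Euclidean distance between two boxes given as (x, y) coordinates
        return math.hypot(p[0] - q[0], p[1] - q[1])
    L1 = dist(A, B)
    L4 = dist(B, C)
    L3 = dist(C, D)
    L2 = dist(D, A)
    return L1 + L2 + L3 + L4  # step S1006: L = L1 + L2 + L3 + L4
```

For a 3-by-4 rectangle of boxes, for instance, the outline length is 3 + 4 + 3 + 4 = 14.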
Next, a description will be given of a case where the XY width is used, by referring to the block diagram of FIG. 27. In the position/direction detecting part 28 of
By the above-described two processes, it is possible to detect the size of the fingerprint pattern relatively simply. Therefore, it is possible to reduce the processing load of the input device 10 and to respond to the movement of the finger without delay.
In addition, by the above-described two detection methods and the method of detecting the area, it is possible to detect the movement in a direction perpendicular to an X-Y surface as well as the movement of the finger in the X-Y direction. Accordingly, by providing a perpendicular position calculating part 64 (
For example, in a state where the finger is lightly touching the sensor part 12, when the user presses the finger strongly against the sensor part 12, the contact area increases. By detecting the area, the length and the XY width, which increase concurrently with the contact area, the input device 10 makes it possible, for example, to zoom in on a 3-dimensional image displayed on the display unit 50.
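One simple way to map the pressing strength to a zoom operation is sketched below. The mapping, the base area and the gain constant are illustrative assumptions, not values taken from the embodiment.

```python
def zoom_factor(current_area, base_area, gain=0.01):
    """Hypothetical mapping from contact area to a zoom factor:
    larger than 1.0 when the finger presses harder (larger contact area),
    smaller than 1.0 when the finger touches more lightly."""
    return 1.0 + gain * (current_area - base_area)
```

With a base area of 100 boxes and a gain of 0.01, a contact area of 150 boxes would yield a zoom factor of 1.5.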
Next, a description will be given of the thinning of data and the compression of data, as methods of reducing the processing load and shortening the processing time.
According to the above-described process (flow chart of FIG. 30), it is possible to reduce not only the data amount but also the calculation amount to 25% through a simple calculation. In addition, as shown in
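A thinning that reduces both the data amount and the calculation amount to 25% can be realized by keeping every other box in both the row and the column directions; the following is a minimal sketch under that assumption (the flow chart of FIG. 30 may organize the loop differently).

```python
def thin(P):
    """Keep only the boxes at even row and column indices,
    reducing the data amount to 1/4 (25%)."""
    return [row[::2] for row in P[::2]]
```

A 4-by-4 array of 16 boxes, for example, is thinned to a 2-by-2 array of 4 boxes.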
Next, a description will be given of the compression. The fingerprint data digitized by the A/D converter 14 is compressed by using a comparator. For example, as shown in
A description will be given of the compression process, with reference to the flow chart of FIG. 33. In step S1201, the loop counter j is initialized with 0. In step S1202, whether or not j is larger than n is determined. When j is larger than n (YES in step S1202), all processes have ended, and thus the process ends. When j is equal to or smaller than n (NO in step S1202), the loop counter i is initialized with 0 in step S1203. In the next step S1204, whether or not i is larger than m is determined. When i is larger than m (YES in step S1204), the process relating to the “j”th boxes has ended. Accordingly, in step S1207, j is incremented so as to perform the process of the “j+1”th boxes. When i is equal to or smaller than m (NO in step S1204), in step S1205, the pixel data of the box at the position (i, j) is read while being compressed into 1/64 (256 gradations to 4 gradations). Thereafter, in step S1206, i is incremented.
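The gradation reduction of step S1205, from 256 gradations to 4 gradations, can be sketched as follows; the comparator of the embodiment is modeled here simply as a right shift of each 8-bit shading value by 6 bits, which is an assumption about its realization.

```python
def compress(P):
    """Quantize 8-bit shading values (256 gradations, 0-255)
    to 2-bit values (4 gradations, 0-3) by dropping the low 6 bits."""
    return [[pixel >> 6 for pixel in row] for row in P]
```

Each value 0-63 maps to gradation 0, 64-127 to 1, 128-191 to 2, and 192-255 to 3.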
According to the above-described process (the flow chart of FIG. 33), it is possible to reduce the data amount to 1/64. As shown in
Additionally,
Next, a description will be given of switching of the operation modes of the input device 10. There are four kinds of operation modes: an authentication process mode, an absolute position process mode, a relative position process mode, and a Z-position (perpendicular position) process mode. In each of the above-described modes, the part performing the process corresponding to the mode is activated. For example, in the Z-position process mode, the perpendicular position calculating part 64 is activated. The switching of the modes is performed in one of three ways: by a switch newly provided on the input device 10, by the computer 22, or according to the kind of finger authenticated by the sensor part 12. In the case where the switch is provided on the input device 10, as shown in
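The relation between the four operation modes and the parts they activate can be sketched as a simple dispatch table; the mode keys and the function name below are illustrative assumptions.

```python
# Hypothetical mapping from operation mode to the part activated in it.
MODE_TO_PART = {
    "authentication": "authentication processing part",
    "absolute_position": "absolute position processing part",
    "relative_position": "relative position processing part",
    "z_position": "perpendicular position calculating part 64",
}

def activate(mode):
    """Return the processing part activated for the given operation mode."""
    if mode not in MODE_TO_PART:
        raise ValueError("unknown operation mode: %s" % mode)
    return MODE_TO_PART[mode]
```

For example, selecting the Z-position process mode activates the perpendicular position calculating part 64.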
Next, in the case where the operation mode is selected by the computer 22, the operation mode is determined by the function keys on a keyboard connected to the computer 22.
Next, a description will be given of a case where the operation mode is selected according to the kind of finger authenticated by the sensor part 12. Here, the kind of finger refers to, for example, the first finger or the thumb. The input device 10 realizes the various operation modes by determining the kind of finger, using the fact that each finger has a different fingerprint. Thus, in the input device 10 of this case, as shown in
Under the above-described construction, the fingerprints of the first finger, the second finger and the third finger of the right hand of the user of the input device 10 are registered in the fingerprint dictionary data 24 beforehand. Further, each of the fingers corresponds to one of the operation modes. For example, the right first finger corresponds to the absolute position process mode, the right second finger corresponds to the relative position process mode, and the right third finger corresponds to the scroll operation. As mentioned above, by registering the fingerprints and the operation modes in the fingerprint dictionary data 24 such that each of the fingerprints corresponds to one of the operation modes, when the user puts the right first finger, for example, on the sensor part 12 and the authentication ends, the position/direction detecting part 28 operates in the absolute position process mode.
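The finger-to-mode correspondence registered in the fingerprint dictionary data 24 can be sketched as follows; the dictionary keys and the fallback behavior for an unregistered finger are illustrative assumptions.

```python
# Hypothetical contents of the fingerprint dictionary data 24:
# each registered finger is paired with an operation mode.
FINGER_TO_MODE = {
    "right_first_finger": "absolute position process mode",
    "right_second_finger": "relative position process mode",
    "right_third_finger": "scroll operation",
}

def mode_for(finger):
    """Operation mode selected after the given finger is authenticated.
    The fallback for an unregistered finger is an assumption here."""
    return FINGER_TO_MODE.get(finger, "relative position process mode")
```

Thus, when the right first finger is authenticated, the device operates in the absolute position process mode.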
As mentioned above, in the case where the operation mode is selected according to the kind of finger authenticated by the sensor part 12, it is possible not only to select the operation mode but also to add independent operations of the input device 10, such as a scroll operation.
The present invention is not limited to the specifically disclosed embodiments, and variations and modifications may be made without departing from the scope of the present invention.
The present application is based on Japanese priority applications No. 2002-030089 filed on Feb. 6, 2002 and No. 2002-234543 filed on Aug. 12, 2002, the entire contents of which are hereby incorporated by reference.
Number | Date | Country | Kind |
---|---|---|---|
2002-030089 | Feb 2002 | JP | national |
2002-234543 | Aug 2002 | JP | national |
Number | Name | Date | Kind |
---|---|---|---|
4896363 | Taylor et al. | Jan 1990 | A |
5109428 | Igaki et al. | Apr 1992 | A |
5943043 | Furuhata et al. | Aug 1999 | A |
6292173 | Rambaldi et al. | Sep 2001 | B1 |
6326950 | Liu | Dec 2001 | B1 |
6360004 | Akizuki | Mar 2002 | B1 |
6603462 | Matusis | Aug 2003 | B2 |
20010036300 | Xia et al. | Nov 2001 | A1 |
20020122026 | Bergstrom | Sep 2002 | A1 |
20030058084 | O'Hara | Mar 2003 | A1 |
Number | Date | Country |
---|---|---|
2243235 | Oct 1991 | GB |
4-156434 | Jun 1992 | JP |
4-158434 | Jun 1992 | JP |
10-275233 | Oct 1998 | JP |
11-183026 | Oct 1999 | JP |
11-283026 | Oct 1999 | JP |
2000057342 | Feb 2000 | JP |
Number | Date | Country
---|---|---
20030146899 A1 | Aug 2003 | US