This application claims priority from Taiwanese application no. 90122445, filed in Taiwan, R.O.C., on Sep. 11, 2001, pursuant to 35 U.S.C. 119(a)–(d).
1. Field of the Invention
The present invention relates in general to a method for detecting the movement of an image sensor. In particular, the present invention relates to a method that determines the movement of an image sensor by comparing the images captured by the sensor at different locations, finding the matched images, and performing verification processes.
2. Description of the Related Art
Normally, prior-art methods use block matching to determine the movement of an image sensor. Ideally, this method correctly finds a matched frame block by minimizing a matching function, such as the mean squared error (MSE) or the mean absolute difference (MAD), and then uses the matched frame blocks to calculate the movement of the image sensor.
In practice, however, there are many sources of noise, arising not only from the image and process variation but also from the environment of the image sensor, such as temperature and illumination variation. Therefore, the matching result can only obtain a frame block 12B whose gray scale is close to that of the frame block 12A, wherein the value of the matching function is the minimum. A frame block 14B matching the frame block 14A is then obtained from the image 10B in the same way.
In addition, the prior art must compare all the frame blocks in the original image to determine the movement of the image sensor from the frames 10A and 10B; that is, the conventional method must perform a full search to obtain the result. The conventional method therefore processes a large amount of data, and the determination is slow.
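The full-search block matching described above can be sketched as follows. This is a minimal hypothetical illustration, not part of the original disclosure: the function names, the use of NumPy, and MAD as the matching function are assumptions.

```python
import numpy as np

def mad(block_a, block_b):
    """Mean absolute difference between two equally sized gray-scale blocks."""
    return np.mean(np.abs(block_a.astype(int) - block_b.astype(int)))

def full_search(block, frame):
    """Exhaustively scan `frame` for the position minimizing the MAD
    against `block`; return (row, col) of the best match and its MAD."""
    bh, bw = block.shape
    fh, fw = frame.shape
    best_pos, best_err = None, float("inf")
    for r in range(fh - bh + 1):
        for c in range(fw - bw + 1):
            err = mad(block, frame[r:r + bh, c:c + bw])
            if err < best_err:
                best_err, best_pos = err, (r, c)
    return best_pos, best_err
```

The nested loops make the cost of the conventional approach visible: every candidate position in the frame is compared, which is the data volume the invention aims to reduce.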
The object of the present invention is to provide a method to detect the movement of an image sensor. When comparing the original image and the following image, the present invention does not compare all the frame blocks in the original image with the following image. Rather, it selects a single frame block in the original image and fully searches the following image for it. In addition, the present invention provides three verification methods to ensure the accuracy of the comparison: cross checking, a maximum error threshold, and determining the number of matched frame blocks in the comparison result. The present invention quickly and accurately determines the movement of an image sensor using less data and fewer operations.
To achieve the above-mentioned object, the present invention provides a method to detect the movement of an image sensor according to the captured images, including the following steps. A first image region is captured from a first image. Then, a first corresponding region matching the first image region is captured from a second image. A second image region is captured from the second image. Then, a second corresponding region matching the second image region is captured from the first image. Finally, the movement of the image sensor is determined according to the first image region and the first corresponding region when a first relative distance between the first image region and the first corresponding region is the same as a second relative distance between the second image region and the second corresponding region, but in the opposite direction.
The present invention will become more fully understood from the detailed description given hereinbelow and the accompanying drawings, given by way of illustration only and thus not intended to be limitative of the present invention.
First Embodiment
The first frame 20A is the image captured by the image sensor at a first location, and the second frame 20B is the image captured after the image sensor moves to a second location. Here, the size of the frame captured by the image sensor is 6×6 units. The frame block 22A, whose size is 2×2 units, is captured from the first frame 20A. Then, frame block matching is performed to find the frame block 22B in the second frame 20B that matches the frame block 22A, i.e., the block for which the value of the matching function is the minimum. In addition, the frame block 22A is taken near the center of the first frame 20A so that the matched frame block is not missed in the second frame 20B when the image sensor has moved far.
Next, a verification process is performed; the present embodiment performs cross checking to verify the determination. The cross checking captures another frame block 24B from the center of the second frame 20B and fully searches the first frame 20A to find a matched frame block 24A. It then calculates the displacement and moving direction between the frame blocks 22A and 22B (represented as the vector 26A) and between the frame blocks 24B and 24A (represented as the vector 26B). The movement and moving direction of the image sensor are detected according to the frame blocks 22A and 22B when the vectors 26A and 26B have the same length and opposite directions.
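The cross-checking step can be sketched as follows. This is a hypothetical Python illustration only: the function names, the use of NumPy, MAD as the matching function, and the block position passed in are assumptions, not part of the original disclosure.

```python
import numpy as np

def best_match(block, frame):
    """Position of the block in `frame` minimizing mean absolute difference."""
    bh, bw = block.shape
    errs = {(r, c): np.abs(block.astype(int)
                           - frame[r:r + bh, c:c + bw].astype(int)).mean()
            for r in range(frame.shape[0] - bh + 1)
            for c in range(frame.shape[1] - bw + 1)}
    return min(errs, key=errs.get)

def cross_check(frame_a, frame_b, pos, size=2):
    """Cross checking: match a block taken from frame A into frame B, then
    a block taken from frame B back into frame A; accept the displacement
    only when the two vectors have equal length and opposite directions."""
    r, c = pos
    # forward: block near the center of frame A, matched in frame B
    fr, fc = best_match(frame_a[r:r + size, c:c + size], frame_b)
    vec_forward = (fr - r, fc - c)
    # backward: block near the center of frame B, matched in frame A
    br, bc = best_match(frame_b[r:r + size, c:c + size], frame_a)
    vec_backward = (br - r, bc - c)
    if (vec_forward[0] == -vec_backward[0]
            and vec_forward[1] == -vec_backward[1]):
        return vec_forward   # verified displacement between the frames
    return None              # verification failed; skip this result
```

Only two full searches are performed per frame pair, rather than one per block of the original image as in the prior art.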
Therefore, the movement and moving direction of the image sensor are detected using less data.
Second Embodiment
The first frame 20A is the image captured by the image sensor at a first location, and the second frame 20B is the image captured after the image sensor moves to a second location. Here, the size of the frame captured by the image sensor is 6×6 units. The frame block 22A, whose size is 2×2 units, is captured from the first frame 20A. Then, frame block matching is performed to find the frame block 22B in the second frame 20B that matches the frame block 22A, i.e., the block for which the value of the matching function is the minimum. In addition, the frame block 22A is taken near the center of the first frame 20A so that the matched frame block is not missed in the second frame 20B when the image sensor has moved far.
Next, a verification process is performed; the present embodiment applies the maximum error threshold method to verify the determination. If the value of the matching function obtained by comparing the frame block 22B with the frame block 22A is larger than a maximum error threshold value, the search result is skipped, since such a result is wrong and would cause errors in operation.
In this embodiment, if the image sensor has 6-bit resolution and a 6×6 unit frame size, and the tolerable noise is assumed to be 10%, the maximum error threshold value, defined in terms of the mean absolute difference, is 6.4 (2^6×10%).
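The threshold test can be sketched as follows, using the embodiment's own numbers: a 6-bit sensor gives gray levels 0 to 63, and a 10% noise tolerance yields a MAD threshold of 2^6 × 0.10 = 6.4. The function names and the use of NumPy are assumptions for illustration only.

```python
import numpy as np

def mad(block_a, block_b):
    """Mean absolute difference between two equally sized gray-scale blocks."""
    return np.mean(np.abs(block_a.astype(int) - block_b.astype(int)))

def accept_match(block_a, block_b, bits=6, tolerance=0.10):
    """Maximum-error-threshold verification: accept the matched pair only
    if the MAD does not exceed (2**bits) * tolerance (6.4 by default)."""
    threshold = (2 ** bits) * tolerance
    return mad(block_a, block_b) <= threshold
```

A pair whose MAD exceeds the threshold is treated as a wrong match and its search result is skipped.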
Therefore, the movement and moving direction of the image sensor are detected more accurately.
Third Embodiment
The first frame 20A is the image captured by the image sensor at a first location, and the second frame 20B is the image captured after the image sensor moves to a second location. Here, the size of the frame captured by the image sensor is 6×6 units. The frame block 22A, whose size is 2×2 units, is captured from the first frame 20A. Then, frame block matching is performed to find the frame block 22B in the second frame 20B that matches the frame block 22A, i.e., the block for which the value of the matching function is the minimum. In addition, the frame block 22A is taken near the center of the first frame 20A so that the matched frame block is not missed in the second frame 20B when the image sensor has moved far.
Next, a verification process is performed; the present embodiment verifies the result of the comparison by determining the number of matched frame blocks. If more than one frame block in the second frame 20B matches the frame block 22A, the search result is skipped, since such an ambiguous result is wrong and would cause errors in operation.
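The ambiguity test can be sketched as follows. This is a hypothetical illustration that counts the positions achieving the minimum MAD; the function names and the use of NumPy are assumptions, not part of the original disclosure.

```python
import numpy as np

def matched_block_count(block, frame):
    """Count the positions in `frame` whose MAD against `block` equals the
    global minimum; a count greater than one indicates an ambiguous match
    whose search result should be skipped, per the third embodiment."""
    bh, bw = block.shape
    errs = [np.abs(block.astype(int)
                   - frame[r:r + bh, c:c + bw].astype(int)).mean()
            for r in range(frame.shape[0] - bh + 1)
            for c in range(frame.shape[1] - bw + 1)]
    best = min(errs)
    return sum(e == best for e in errs)
```

On a featureless frame (for example, a uniform gray field) every position ties for the minimum and the result is rejected, whereas a textured frame typically yields a single match.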
Although embodiments of the present invention skip data that might be wrong, this has no influence on the overall determination result of the image sensor because the frame rate of the image sensor is very high; therefore, user viewability is not affected. In the case of an optical mouse, although skipping some wrong data makes the cursor pause briefly, the cursor moves to the correct location according to the next frame. However, if the cursor of the optical mouse were moved according to the wrong data, it would jump to an unexpected location, confusing the user operating the optical mouse.
Accordingly, the method according to the present invention captures only part of each frame and combines the verification methods described above to obtain the movement of an image sensor, which decreases the amount of data and simplifies the determination.
The foregoing description of the preferred embodiments of this invention has been presented for purposes of illustration and description. Obvious modifications or variations are possible in light of the above teaching. The embodiments were chosen and described to provide the best illustration of the principles of this invention and its practical application to thereby enable those skilled in the art to utilize the invention in various embodiments and with various modifications as are suited to the particular use contemplated. All such modifications and variations are within the scope of the present invention as determined by the appended claims when interpreted in accordance with the breadth to which they are fairly, legally, and equitably entitled.
Number | Date | Country | Kind |
---|---|---|---|
90122445 A | Sep 2001 | TW | national |
Number | Name | Date | Kind |
---|---|---|---|
4521773 | Lyon | Jun 1985 | A |
4794384 | Jackson | Dec 1988 | A |
4937666 | Yang | Jun 1990 | A |
5107293 | Sekine et al. | Apr 1992 | A |
5563652 | Toba et al. | Oct 1996 | A |
5661524 | Murdock et al. | Aug 1997 | A |
5729008 | Blalock et al. | Mar 1998 | A |
5734933 | Sekine et al. | Mar 1998 | A |
5838828 | Mizuki et al. | Nov 1998 | A |
5930405 | Chida | Jul 1999 | A |
6172354 | Adan et al. | Jan 2001 | B1 |
6222174 | Tullis et al. | Apr 2001 | B1 |
6281882 | Gordon et al. | Aug 2001 | B1 |
6664948 | Crane et al. | Dec 2003 | B1 |
6809758 | Jones | Oct 2004 | B1 |
6859199 | Shi | Feb 2005 | B1 |
7057148 | Wang | Jun 2006 | B1 |
7079116 | Park et al. | Jul 2006 | B1 |
7085418 | Kaneko et al. | Aug 2006 | B1 |
20030081129 | Lin et al. | May 2003 | A1 |
20050151724 | Lin et al. | Jul 2005 | A1 |
Number | Date | Country |
---|---|---|
WO 9843436 | Oct 1998 | WO |
Number | Date | Country | |
---|---|---|---|
20040201705 A1 | Oct 2004 | US |