The invention relates to an optical flow sensing mechanism, and more particularly to a novel optical flow sensor, method, remote controller device, and rotatable electronic device.
Generally speaking, image sensor mechanisms may be applied to different products or devices such as an optical mouse navigation device, a remote controlling scheme, and/or an unmanned aerial vehicle. In a conventional optical mouse navigation algorithm, no filters or only fixed filters are applied to the image to enhance features captured from a working surface. In addition, a lighting controller system is applied to light up the working surface while all of the sensor pixels are exposed; that is, the uncontrollable ambient light is not used as a main reference by the conventional optical mouse navigation algorithm. Further, a conventional remote controller may be based on the interaction between an infrared (IR) sensor and IR LED beacons. Usually, a display device is implemented with an external sensing bar to detect the motion/control of the conventional remote controller; however, the external sensing bar increases the hardware cost. Further, at least three image sensors are needed and employed in the same conventional optical flow sensor system to detect motion of an unmanned aerial vehicle and thereby obtain the six-axis data of the unmanned aerial vehicle. This, however, results in higher circuit cost.
Therefore, one of the objectives of the invention is to provide an optical flow sensing method, an optical flow sensor, a remote controller device, a rotatable electronic device, and so on, to solve the above-mentioned problems.
According to embodiments of the invention, an optical flow sensing method is disclosed. The method comprises: using a first image sensor to capture a first image and a second image; using a first directional-invariant filter device upon at least one first block of the first image to process pixel values of the at least one first block of the first image, to generate a first filtered block image; using the first directional-invariant filter device upon at least one first block of the second image to process pixel values of the at least one first block of the second image, to generate a second filtered block image; comparing the first filtered block image with the second filtered block image to calculate a first correlation result; and, estimating a motion vector according to a plurality of first correlation results.
According to the embodiments, an optical flow sensor is further disclosed. The optical flow sensor comprises a first image sensor, a first directional-invariant filter device, and a processor. The first image sensor is configured for capturing a first image and a second image. The first directional-invariant filter device is coupled to the first image sensor, applied for at least one first block of the first image to process pixel values of the at least one first block of the first image, to generate a first filtered block image, and is also applied for at least one first block of the second image to process pixel values of the at least one first block of the second image, to generate a second filtered block image. The processor is coupled to the first directional-invariant filter device and is configured for comparing the first filtered block image with the second filtered block image to calculate a first correlation result and for estimating a motion vector according to a plurality of first correlation results.
According to the embodiments, a remote controller device is further disclosed. The device comprises the above-mentioned optical flow sensor, a control switch, and a processing circuit. The processing circuit is configured for detecting a status of the control switch and for outputting or reporting the estimated motion vector when detecting that the control switch is at an ON state.
According to the embodiments, a rotatable electronic device is further disclosed. The rotatable electronic device comprises an optical flow sensor which comprises a first image sensor, a first directional-invariant filter device, and a processor. The first image sensor is configured for capturing a first image and a second image. The first directional-invariant filter device is coupled to the first image sensor, and is applied for at least one first block of the first image to process pixel values of the at least one first block of the first image, to generate a first filtered block image, and also applied for at least one first block of the second image to process pixel values of the at least one first block of the second image, to generate a second filtered block image. The processor is coupled to the first directional-invariant filter device, and is configured for comparing the first filtered block image with the second filtered block image to calculate a first correlation result, and for estimating a motion vector according to a plurality of first correlation results.
According to the embodiments, at least one of the following advantages can be provided. The mechanism of the invention is capable of precisely estimating motion, planar rotation, or behavior of a rotatable electronic device such as an unmanned aerial vehicle, a virtual reality interactive device, or an augmented reality interactive device based on sensing results of merely two image sensors of the same optical flow sensor (or system). For example, this can effectively obtain the six-axis data of an unmanned aerial vehicle. Compared to the conventional optical flow scheme using at least three image sensors to obtain the six-axis data, the hardware costs can be significantly reduced. Further, a remote controller device, if implemented with the above-mentioned optical flow sensor of the embodiments, can be arranged to adaptively generate and output the motion of a user's hand to a display device, so that the display device is not required to be used with a sensor bar for sensing the motion of the user's hand. To summarize, in the embodiments, novel optical flow sensors and corresponding methods are provided and can be applied to a variety of electronic devices, thereby reducing the hardware costs and/or the complexity of software computations.
These and other objectives of the present invention will no doubt become obvious to those of ordinary skill in the art after reading the following detailed description of the preferred embodiment that is illustrated in the various figures and drawings.
Please refer to
Step 105: Start;
Step 110: Use the image sensor 205 to capture images;
Step 115: Use the directional-invariant filter device 210 upon blocks of the images to generate filtered block images;
Step 120: Calculate correlation result(s) of the filtered block images;
Step 125: Estimate a motion vector according to the correlation result(s);
Step 130: Store the estimated motion vector and the filtered block images; and
Step 135: End.
In practice, the processor 220 is used for controlling the image sensor 205 and the global shutter 215 to capture images. The captured images are transmitted to the directional-invariant filter device 210, which is used for processing blocks (also called block images) of the captured images to generate filtered block images. The processor 220 receives the filtered block images to calculate correlation results to obtain motion vector estimation results.
In Step 110, the image sensor 205 is used with the global shutter 215 in a global shutter mode, in which every pixel in a frame is exposed simultaneously at the same instant in time, as opposed to a rolling shutter mode, in which each row in a frame is exposed for the same amount of time but the exposure begins at a different point in time. In addition, in the global shutter mode, the pixels in each line of a frame are exposed to the same or a similar amount of light such as the same environmental light. Unlike the conventional enclosed optical mouse system, the introduced optical flow sensor 200 utilizes the ambient light, which is uncontrollable, and the global shutter 215 is introduced to sense the reflected light.
The image sensor 205, for example, captures a first image and a second image at different timings. In Step 115, the directional-invariant filter device 210 is used for processing pixel values of at least one portion of the first image to generate at least one first filtered portion image and for processing pixel values of at least one portion of the second image to generate at least one second filtered portion image. For instance, the directional-invariant filter device 210 can be arranged to re-assign a corresponding value for each pixel of the at least one portion of the first image to generate the at least one first filtered portion image and also to re-assign a corresponding value for each pixel of the at least one portion of the second image to generate the at least one second filtered portion image. However, this is not meant to be a limitation; in other examples, the directional-invariant filter device 210 can be arranged to process the at least one portion of the images to generate filtered images by interpolation and/or other pixel processing operations. In addition, for example, in the embodiment, the portion of the first/second image may be defined as a block or a block image having M×N pixels, wherein M and N indicate identical or different integers. That is, the directional-invariant filter device 210 may process at least one block of the first image to re-assign a corresponding value for each pixel of the at least one block of the first image to generate at least one filtered block image, and may process at least one block of the second image to re-assign a corresponding value for each pixel of the at least one block of the second image to generate at least one filtered block image. Please refer to
Further, in this embodiment, the directional-invariant filter device 210 is configured for processing blocks at the same spatial positions respectively within two different images; that is, the position of the block of the first image corresponds to that of the block of the second image. However, this is not meant to be a limitation. In other embodiments, the directional-invariant filter device 210 may be used for processing blocks at different spatial positions respectively within two different images if necessary.
V1=2Xo−(Do+Eo);
V2=2Xo−(Bo+Go);
V3=2Xo−(Ao+Ho);
V4=2Xo−(Co+Fo);
V5=(4Xo−(Ao+Co+Fo+Ho))/2;
wherein V1-V5 respectively indicate the results of the five tests mentioned above, and Ao-Ho and Xo respectively indicate the outputs of the nine pre-processing filters.
The directional-invariant filter device 210 is arranged to use the nine pre-processing filters to generate filter outputs, perform the five tests upon each pixel of the block of the image based on the filter outputs, and then assign a corresponding value for each pixel based on the results of the five tests. In the embodiment, the directional-invariant filter device 210 is arranged to select a minimum value among the results of the five tests as the corresponding value which is assigned to each pixel. Alternatively, in other embodiments, the directional-invariant filter device 210 may be arranged to select a maximum value among the results of the five tests as a first value, select a minimum value among the results of the five tests as a second value, and then calculate the difference between the first and second values. If the absolute value of the calculated difference is smaller than the result V5 of the fifth test, the directional-invariant filter device 210 is arranged to select the value of V5 as the corresponding value which is to be assigned to the pixel; otherwise, the directional-invariant filter device 210 is arranged to select the minimum value among the results of the five tests as the corresponding value which is to be assigned to the pixel. After assigning corresponding values for all pixels within a block of the image, the directional-invariant filter device 210 outputs the assigned values as a filtered block image. The filtered block image is transmitted to the processor 220 and may also be stored in a storage device such as a memory (not shown in
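As a concrete illustration, the five tests and both value-selection rules above can be sketched as follows. The mapping of the nine filter outputs Ao-Ho and Xo to neighborhood positions follows a figure not reproduced in this text, so the raster ordering assumed below (A B C / D X E / F G H around the center X) is only a labeling assumption:

```python
def directional_invariant_value(a, b, c, d, x, e, f, g, h, use_v5_variant=False):
    """Combine the nine pre-processing filter outputs (Ao-Ho around the
    center Xo) into one re-assigned pixel value via the five tests."""
    v1 = 2 * x - (d + e)                 # V1 = 2Xo - (Do + Eo)
    v2 = 2 * x - (b + g)                 # V2 = 2Xo - (Bo + Go)
    v3 = 2 * x - (a + h)                 # V3 = 2Xo - (Ao + Ho)
    v4 = 2 * x - (c + f)                 # V4 = 2Xo - (Co + Fo)
    v5 = (4 * x - (a + c + f + h)) / 2   # V5 = (4Xo - (Ao+Co+Fo+Ho)) / 2
    tests = [v1, v2, v3, v4, v5]
    if not use_v5_variant:
        # Embodiment 1: the minimum of the five test results.
        return min(tests)
    # Alternative embodiment: if |max - min| of the five results is
    # smaller than V5, assign V5; otherwise assign the minimum.
    if abs(max(tests) - min(tests)) < v5:
        return v5
    return min(tests)
```

Repeating this per pixel over an M×N block yields the filtered block image that is forwarded to the processor 220.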
In Step 120, the processor 220 is arranged to calculate a correlation result of two filtered block images, such as two filtered block images corresponding to two consecutive frames/images. The processor 220 performs the correlation calculation by comparing values of the two filtered block images. For example, the processor 220 may shift all pixel values of one filtered block image by one pixel unit to generate a shifted block image, then calculate the pixel difference between the shifted block image and the other filtered block image, and finally output the pixel difference as the correlation result of the two filtered block images. Alternatively, the processor 220 may multiply pixels of the shifted block image with corresponding pixels of the other filtered block image respectively to generate multiplication results, and sum all the multiplication results as the correlation result of the two filtered block images.
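Both correlation variants of Step 120 can be sketched as below. The boundary handling (comparing only the region where the shifted and unshifted blocks overlap) is an implementation assumption, since the embodiment does not fix it:

```python
import numpy as np

def _overlap(block_a, block_b, dx, dy):
    """Return the overlapping regions of block_a and block_b shifted by
    (dx, dy) pixels; the overlap-only policy is an assumption."""
    h, w = block_a.shape
    x0, x1 = max(dx, 0), w + min(dx, 0)
    y0, y1 = max(dy, 0), h + min(dy, 0)
    return (block_a[y0:y1, x0:x1],
            block_b[y0 - dy:y1 - dy, x0 - dx:x1 - dx])

def correlation_sad(block_a, block_b, dx, dy):
    """Pixel-difference variant: sum of absolute differences."""
    a, b = _overlap(block_a, block_b, dx, dy)
    return np.abs(a - b).sum()

def correlation_product(block_a, block_b, dx, dy):
    """Multiply-and-sum variant of the correlation result."""
    a, b = _overlap(block_a, block_b, dx, dy)
    return (a * b).sum()
```

With the difference variant, a smaller result indicates a better match at that shift; with the product variant, a larger result does.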
Further, in other embodiments, the processor 220 can perform the correlation calculation by shifting each pixel value of the filtered block images by a sub-pixel unit. For example, the processor 220 can shift by a 1/2, 1/4, 1/8, 1/16, or 1/32 pixel unit by performing interpolations upon the filtered block images. For instance, if each filtered block image has 200×200 pixels, the processor 220 may calculate and generate an interpolated pixel between every two adjacent pixels to achieve the shifting of a 1/2 pixel unit.
Alternatively, in other embodiments, the processor 220 can perform interpolation upon the correlation results calculated from the filtered block images, instead of performing interpolation upon the filtered block images.
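The two sub-pixel variants above can be illustrated with short sketches: inserting an interpolated pixel between every two adjacent pixels of a filtered block (the 1/2-pixel example), and, alternatively, interpolating the correlation results themselves. The parabolic fit used below for the latter is one common choice, not one prescribed by the embodiment:

```python
def half_pixel_upsample_1d(row):
    """Insert the average of every two adjacent pixels (the 1/2-pixel
    shift example for a filtered block, shown here on one row)."""
    out = []
    for i, p in enumerate(row):
        out.append(p)
        if i + 1 < len(row):
            out.append((p + row[i + 1]) / 2.0)
    return out

def subpixel_peak(c_left, c_best, c_right):
    """Refine an integer-pixel correlation minimum to a sub-pixel offset
    by fitting a parabola through three adjacent correlation results
    (one way to interpolate the correlation results instead of the
    block images)."""
    denom = c_left - 2.0 * c_best + c_right
    if denom == 0:
        return 0.0
    return 0.5 * (c_left - c_right) / denom
```

Interpolating the correlation results avoids upsampling whole 200×200 blocks, at the cost of a coarser model of the correlation surface.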
After calculating the correlation result(s), in Step 125, the processor 220 is arranged to estimate a motion to obtain a motion vector according to the correlation result(s). The processor 220 may transmit and store the estimated motion vector in the storage device (Step 130). Then, at the next timing, the processor 220 is able to read back the motion vector estimated at the previous timing to assist the estimation/prediction of motion, so as to more accurately estimate the motion vector or to reduce the computation load. That is, a motion vector estimated at a previous timing can be further used as a reference to calculate a correlation result of another set of filtered block images at a later timing.
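Using the stored motion vector from the previous timing as a search prior can be sketched as follows; the search radius and the sum-of-absolute-differences cost are assumptions of this sketch, not values fixed by the embodiment:

```python
import numpy as np

def sad(a, b, dx, dy):
    """Sum of absolute differences over the overlap of a and b shifted
    by (dx, dy); the overlap-only boundary policy is an assumption."""
    h, w = a.shape
    x0, x1 = max(dx, 0), w + min(dx, 0)
    y0, y1 = max(dy, 0), h + min(dy, 0)
    return np.abs(a[y0:y1, x0:x1]
                  - b[y0 - dy:y1 - dy, x0 - dx:x1 - dx]).sum()

def estimate_motion(block_a, block_b, prev_mv=(0, 0), radius=2):
    """Search candidate displacements centered on the previously stored
    motion vector and keep the best-matching one, so the search window
    (and hence the computation load) stays small."""
    best = None
    for dy in range(prev_mv[1] - radius, prev_mv[1] + radius + 1):
        for dx in range(prev_mv[0] - radius, prev_mv[0] + radius + 1):
            score = sad(block_a, block_b, dx, dy)
            if best is None or score < best[0]:
                best = (score, dx, dy)
    return best[1], best[2]
```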
Further, in an embodiment, an external control switch may be installed on an electronic device implemented with the optical flow sensor 200; the electronic device may be configured for detecting a status of the external control switch and arranged for outputting the estimated motion vector(s) when the electronic device detects that the external control switch is at an ON state. For example, if a user controls and switches the external control switch to the ON state, the electronic device will be arranged for outputting the estimated motion vector(s). Please refer to
Further, it should be noted that the optical flow sensor 200 mentioned above can be applied to a variety of electronic devices and products such as an Unmanned Aerial Vehicle (UAV), Unmanned Aircraft System (UAS), Unmanned Underwater Vehicle (UUV), Dynamic Remotely Operated Navigation Equipment (DRONE), remote controller device, virtual reality (VR) interactive device, augmented reality (AR) interactive device, and so on. For example, if the optical flow sensor 200 is applied to an unmanned aerial vehicle device having a Gyro sensor, a barometer, and an acoustic sensor, the unmanned aerial vehicle device can be arranged to obtain proportional integral derivative (PID) information from at least one of the Gyro sensor, the barometer, and the acoustic sensor after obtaining an estimated motion vector outputted from the optical flow sensor 200. Then, the unmanned aerial vehicle device can calibrate the estimated motion vector by referring to the obtained proportional integral derivative information, and thus compensate the motion of the unmanned aerial vehicle device based on the motion vector which has been calibrated. This can effectively improve the stability of motion of the unmanned aerial vehicle device. For example, the estimated motion vectors of the unmanned aerial vehicle device can be compensated based on the PID information, and the compensated translations can be represented by the following equations:
wherein (Vx, Vy) denotes the optical flow motion vectors, (Wx, Wy) denotes the angular velocities of the X-axis and Y-axis, f and Z respectively denote the focal length and the current distance to the scene, and (Tx, Ty) denotes the compensated translations.
Step 705: Start;
Step 710: Receive or obtain motion data such as a motion vector from the optical flow sensor 200;
Step 715: Receive or obtain proportional integral derivative (PID) information from at least one of the Gyro sensor, the barometer, and the acoustic sensor;
Step 720: Compensate the received motion data or motion vector based on the received PID information;
Step 725: Compensate or calibrate motion or behavior of the unmanned aerial vehicle device by using the compensated motion data or motion vector; and
Step 730: End.
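Since the compensation equations themselves are not reproduced in this text, the following sketch uses one common de-rotation form consistent with the listed symbols (Vx, Vy), (Wx, Wy), f, Z, and (Tx, Ty): subtract the rotation-induced part of the optical flow, then scale by Z/f. The axis pairing between flow components and gyro axes, and the signs, depend on the camera mounting and are assumptions of this sketch, not the patent's equations:

```python
def compensate_translation(vx, vy, wx, wy, f, z):
    """Hypothetical de-rotation step: remove the rotation-induced flow
    (approximately f times the angular velocity) from the measured flow
    (vx, vy), then scale by distance-over-focal-length to obtain the
    compensated translations (tx, ty).  Axis pairing/signs assumed."""
    tx = (vx - f * wy) * z / f
    ty = (vy + f * wx) * z / f
    return tx, ty
```

Under this form, a pure rotation (flow entirely explained by the gyro reading) compensates to zero translation, which is the stabilizing behavior described in Steps 720-725.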
Please refer to
Step 805: Start;
Step 810A: Use the image sensor 905A to capture images;
Step 810B: Use the image sensor 905B to capture images;
Step 815A: Use the directional-invariant filter device 910A upon blocks of the images to generate filtered block images;
Step 815B: Use the directional-invariant filter device 910B upon blocks of the images to generate filtered block images;
Step 820A: Calculate correlation result(s) of the filtered block images;
Step 820B: Calculate correlation result(s) of the filtered block images;
Step 825A: Estimate a motion vector according to the correlation result(s);
Step 825B: Estimate a motion vector according to the correlation result(s);
Step 830A: Store the estimated motion vector and the filtered block images;
Step 830B: Store the estimated motion vector and the filtered block images;
Step 835: Perform planar rotation estimation based on motion vector of Step 830A and motion vector of Step 830B; and
Step 840: End.
The operations and functions of image sensor 905A/905B and directional-invariant filter device 910A/910B are similar to those of image sensor 205 and directional-invariant filter device 210, and are not detailed for brevity.
In Step 835, the processor 220 of the sensor 900 is used for performing planar rotation estimation based on the estimated motion vector of Step 830A and the estimated motion vector of Step 830B. In practice, the two image sensors 905A and 905B of the sensor 900 can be configured or positioned at different positions on the same plane. For example, the two image sensors 905A and 905B positioned at different positions on the same plane can be used for estimating the planar rotation angle of motion of an unmanned aerial vehicle device or DRONE.
As shown in the center of
wherein Δx indicates x′-x, and Δy means y′-y. Thus, as shown in the right of
θ = arctan2(Δy, r+Δx)
θ = arctan2(Δy2−Δy1, r+Δx2−Δx1)
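The planar rotation estimate can be computed directly from the two sensors' motion vectors and the separation r between the sensors, following θ = arctan2(Δy2−Δy1, r+Δx2−Δx1):

```python
import math

def planar_rotation(mv1, mv2, r):
    """Planar rotation angle (radians) from the motion vectors
    (dx1, dy1) and (dx2, dy2) of the two same-plane image sensors,
    where r is the distance between the two sensors."""
    (dx1, dy1), (dx2, dy2) = mv1, mv2
    return math.atan2(dy2 - dy1, r + dx2 - dx1)
```

When the two sensors report identical motion vectors, the relative displacement vanishes and the estimated rotation angle is zero, i.e. the device translated without rotating in the plane.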
Further, in other embodiments, if the sensor 900 is applied to an unmanned aerial vehicle device or DRONE and the two image sensors 905A & 905B are positioned at different positions on different planes to respectively estimate motion vectors from different portions of captured images, the sensor 900 is capable of estimating the six-axis data of motion and behavior of the unmanned aerial vehicle device or DRONE based on merely two image sensors. The angle between the different planes can be 90 degrees or another value. Please refer to
For example, the image sensor 905A positioned on the plane 1100A is arranged to capture two portions of images 1101 and 1102 to estimate the motion of the portion of image 1101 and motion of the portion of image 1102 thereby generating two motion vectors which can be represented by (Δx1,Δy1) and (Δx2,Δy2). For calculating/estimating each motion vector, the image sensor 905A can perform calculation based on the steps in
ƒφ[(Δx1,Δy1),(Δx2,Δy2)]=Δφ
ƒψ[(Δx3,Δy3),(Δx4,Δy4)]=Δψ
ƒθ[(Δy3,Δy1),(Δy4,Δy2)]=Δθ
wherein each of ƒφ[ ], ƒψ[ ], and ƒθ[ ] is a function used for generating a corresponding rotation angle based on two different vectors; each such function is based on the same assumption as the above-mentioned equations used for estimating the planar rotation angle and is not detailed for brevity.
Compared to the conventional mechanism adopting at least three separate image sensors positioned on three different planes to estimate the six-axis data of the DRONE, only two separate image sensors are required for the DRONE implemented with the optical flow sensor 900 to detect motions of images thereby estimating the six-axis data. The hardware costs can be significantly reduced.
In order to more clearly describe the estimation of motion of an unmanned aerial vehicle device based on the scheme of the invention,
As shown in
As shown in
It should be noted that all of the above-mentioned implementations of unmanned aerial vehicle devices are used for illustrative purposes and are not meant to be limitations of the invention. In other embodiments, the optical flow sensor 200 or 900 can be applied to a variety of electronic devices including other types of unmanned aerial vehicle devices and/or AR/VR interactive devices.
Those skilled in the art will readily observe that numerous modifications and alterations of the device and method may be made while retaining the teachings of the invention. Accordingly, the above disclosure should be construed as limited only by the metes and bounds of the appended claims.
Number | Date | Country
---|---|---
20180348032 A1 | Dec 2018 | US