This invention relates to digital image stitching and in particular to an image stitching method for generating a 360 degree panoramic image from a sequence of images.
Digital photography is becoming more popular today as digital cameras and scanners are becoming widely available. Digital images can be created either by capturing a scene using a digital camera or digitizing a traditional film-based photograph using a scanner. One particular advantage of digital photography over traditional film-based photography is that digital images can be easily manipulated or edited for better presentation. Digital images can also be readily distributed over the Internet.
When a photographer captures a scene using a camera, the desired field of view may be larger than the normal field of view of the camera. Digital photography allows a panoramic image to be produced without the need to purchase special equipment such as a panoramic camera or fisheye lenses. For example, a photographer with a digital camera may capture a series of digital pictures of a scene by rotating the camera and taking pictures in a sequence of different directions. The captured images may then be stitched together to produce a panoramic picture of the scene. Similarly, film-based photographs can be digitized, and the panoramic picture can be composed by stitching together the digitized images. Presently, digital image programs are available for stitching multiple digital images together to form a panoramic picture. Exemplary programs include Ulead Cool 360™, Live Picture PhotoVista™, and MGI PhotoSuite III™.
Typically a conventional image program stitches images by matching corresponding features on two source images and rotating one source image so the corresponding features overlap. For example, a 360 degree panoramic image is constructed from a sequence of many images where the last image is stitched to the first image to complete the 360 degree view. However, errors (e.g., matching and motion estimation errors) cause a gap between the last image and the first image that must be compensated so they can be stitched together. Therefore, there is a need for a method to compensate these errors in a reasonable way so the last image can be stitched to the first image.
In one embodiment, a method for creating a 360 degree panoramic image from multiple images includes (1) computing a gross rotation error ΔR between a first image and a calculated first image rotated to be stitched to a last image, and (2) spreading the gross rotation error ΔR to each pixel of the panoramic image. In one embodiment, spreading the gross rotation error ΔR includes (1) computing a rotation angle θ0 and rotational axis n0 from the gross rotational error ΔR, (2) determining an angle α of each pixel, and (3) determining a compensation matrix Rc for each pixel using the following formula:
In one embodiment, spreading the gross rotation error ΔR further includes (4) tracing a first pixel in a column on the panoramic image to a camera optical center of the images to form a first ray, (5) determining a second ray originating from the camera optical center that would be rotated by the compensation matrix Rc to coincide with the first ray, (6) tracing the second ray to a second pixel on one of the images, and (7) painting the first pixel with the color values of the second pixel.
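The compensation formula itself does not survive in this text. Assuming, consistently with the spreading described above, that Rc is the rotation about axis n0 by the fraction α/2π of the angle θ0 (built with Rodrigues' formula), the per-pixel compensation and the ray lookup of steps (4)–(7) can be sketched as follows; `axis_angle_matrix`, `compensation_matrix`, and `source_ray` are illustrative names, not from the source:

```python
import numpy as np

def axis_angle_matrix(n, theta):
    """Rotation matrix for angle theta about unit axis n (Rodrigues' formula)."""
    n = np.asarray(n, dtype=float)
    K = np.array([[0.0, -n[2], n[1]],
                  [n[2], 0.0, -n[0]],
                  [-n[1], n[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def compensation_matrix(n0, theta0, alpha):
    """Spread the gross rotation error over the panorama: a pixel whose
    column sits at angle alpha (0..2*pi) receives the fraction
    alpha / (2*pi) of the full error angle theta0 (an assumption
    consistent with the text, not a formula quoted from it)."""
    return axis_angle_matrix(n0, theta0 * alpha / (2.0 * np.pi))

def source_ray(Rc, first_ray):
    """Step (5): the second ray is the one that Rc rotates onto the
    first ray, i.e. Rc @ r2 = r1, so r2 = Rc^T @ r1 (Rc is orthogonal)."""
    return Rc.T @ np.asarray(first_ray, dtype=float)
```

At α = 0 the compensation is the identity, and at α = 2π it equals the full gross rotation, so the error is absorbed gradually around the cylinder.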
In one embodiment of the invention, a computer generates a 360 degree panoramic image by determining the focal length of the camera, matching feature points between adjacent images, and using the focal length and the feature points to determine the positions of the adjacent images around a fixed camera optical center. The computer assumes that the adjacent images are taken by a camera rotated from the fixed camera optical center. After the images are arranged about the camera optical center, their pixels can be projected onto a cylindrical surface (or vice versa) to generate the 360 degree panoramic image. For additional details regarding determining the focal length, matching feature points, and determining the position of adjacent images, please see U.S. patent application Ser. No. 09/665,917, filed Sep. 20, 2001, which is incorporated by reference in its entirety.
The relative rotation matrices between adjacent images P[0] and P[1], P[1] and P[2], . . . , P[8] and P[9], and P[9] and P[0] can be defined as RH[0], RH[1], . . . , and RH[9], where “R” means “rotation” and “H” means the rotation is a relative motion in the horizontal direction. The absolute rotation matrices for images P[0], P[1], . . . , and P[9] can be defined as R[0], R[1], . . . , and R[9]. Accordingly, R[0]=I, R[i+1]=RH[i]*R[i] for i=0, 1, . . . , 8, and the recomputed rotation of the first image after a full revolution is R′[0]=RH[9]*R[9].
If there is no matching or motion estimation error, R′[0] should be an identity matrix like R[0]. In reality, R′[0] is not an identity matrix because of the matching and motion estimation errors. For images P[0] to P[9] to be seamlessly stitched, the computer must make R′[0]=R[0]=I.
If R′[0]*ΔR=R[0]=I, then ΔR=(R′[0])⁻¹. ΔR is defined as the gross rotation error for matching and motion estimation. ΔR is the rotation necessary to rotate a calculated first image P′[0] to overlap the original first image P[0], where the calculated first image P′[0] is the first image P[0] rotated to be stitched to the last image P[9]. In other words, ΔR is the rotation matrix needed to align the calculated first image P′[0] with the original first image P[0] so the last image P[9] can be stitched to the first image P[0] to form the 360 degree panoramic image. There is a need for an algorithm to spread the gross error ΔR over the stitched image so the last image P[9] can be stitched to the first image P[0].
In action 204, the computer calculates the gross rotational matrix ΔR from the relative rotation matrices using the following formula:
ΔR=(R′[0])⁻¹.
ΔR is the relative rotation matrix between the original first image P[0] and the calculated first image P′[0] that can be stitched to the last image P[9].
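The computation of action 204 can be sketched as follows. The composition order of the relative rotations (left-multiplication of each RH[i]) is an assumption; with perfect matching the accumulated product would already be the identity:

```python
import numpy as np

def gross_rotation_error(RH):
    """RH: list of 10 relative rotation matrices RH[0..9] around the ring.
    Chain them to recompute the first image's rotation R'[0] after a full
    revolution, then return the gross rotation error dR = (R'[0])^-1."""
    R = np.eye(3)
    for RHi in RH:
        R = RHi @ R                      # accumulate relative motions
    R_prime_0 = R                        # would be I with no errors
    return np.linalg.inv(R_prime_0)      # = (R'[0])^-1
```

Applying the returned ΔR after the accumulated chain restores the identity, which is exactly the condition R′[0]*ΔR = I stated above.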
In action 206, the computer calculates the rotational axis form of the gross rotational matrix ΔR. The rotational axis form is defined by a unit rotation axis n0 and a rotational angle θ0 such that ΔR is the rotation of angle θ0 about axis n0.
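The axis-angle extraction of action 206 can be sketched using the standard identities (the angle from the trace of ΔR, the axis from its antisymmetric part); `rotation_axis_form` is an illustrative name, not from the source:

```python
import numpy as np

def rotation_axis_form(R, eps=1e-8):
    """Recover (n0, theta0) such that R is the rotation of theta0 about
    unit axis n0.  theta0 comes from the trace of R; n0 from the
    antisymmetric part of R."""
    theta = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    if theta < eps:                      # no gross error: axis is arbitrary
        return np.array([0.0, 0.0, 1.0]), 0.0
    n = np.array([R[2, 1] - R[1, 2],
                  R[0, 2] - R[2, 0],
                  R[1, 0] - R[0, 1]]) / (2.0 * np.sin(theta))
    return n, theta
```

This sketch assumes θ0 < π, which holds for the small accumulated errors the method is meant to compensate.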
In action 208, the computer computes the size of cylindrical surface 102 (i.e., the size of the final 360 degree panoramic image). The size of cylindrical surface 102 is determined by its radius, which can be arbitrary. In one embodiment, the user sets the radius by selecting one of (1) the average focal length, (2) ½ of the average focal length, or (3) ¼ of the average focal length.
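A minimal sketch of action 208, assuming the panorama width is the full 360 degree circumference of the cylinder in pixels and the height is carried over from the source images (the height choice is an assumption; the text fixes only the radius):

```python
import math

def cylinder_size(radius, image_height):
    """Panorama size for a cylinder of the chosen radius: width is the
    circumference 2*pi*r rounded to whole pixels; height is assumed to
    follow the source image height."""
    width = int(round(2.0 * math.pi * radius))
    return width, image_height
```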
In action 210, the computer selects a starting column Q on cylindrical surface 102.
In action 212, the computer selects a starting pixel Pf in the current column.
In action 214, the computer compensates current pixel Pf (i.e., the selected pixel) with a rotational compensation matrix Rc. The rotational compensation matrix Rc for pixel Pf can be used for all the pixels in the same column (e.g., all the pixels in selected column Q) to reduce the computational cost. Rotational compensation matrix Rc is the gross rotational matrix ΔR spread out among the columns on cylindrical surface 102. As unit directional vector n0 and rotational angle θ0 were calculated in action 206, the compensation rotation matrix Rc can be defined as the rotation of angle (α/2π)*θ0 about axis n0.
Angle α is the angle, measured about the axis of cylindrical surface 102, from starting column Q to the current column, so α grows from 0 to 2π across the panorama and the compensation grows from zero to the full gross rotation error ΔR.
In action 216, the computer determines if the current pixel Pf is the last pixel in the current column Q. If it is not, action 216 is followed by action 218. If the current pixel Pf is the last pixel in the current column Q, then action 216 is followed by action 220.
In action 218, the computer selects a next pixel in the current column. Action 218 is followed by action 214 and the above actions cycle until all the pixels in the current column have been compensated.
In action 220, the computer determines if the current column is the last column on cylindrical surface 102. If it is not, action 220 is followed by action 222. If the current column is the last column, then action 220 is followed by action 224, which ends method 200.
In action 222, the computer selects a next column on cylindrical surface 102. Action 222 is followed by action 212 and the above actions cycle until all the pixels in all the columns have been compensated.
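The column-and-pixel loop of actions 210–222 can be sketched as follows. `sample_source` stands in for tracing the second ray back to a source image and reading its color, and `cylinder_ray` assumes the cylinder's axis is the y axis; both are illustrative assumptions, not functions named in the source:

```python
import numpy as np

def compensate_panorama(pano_w, pano_h, n0, theta0, sample_source):
    """Walk the columns of the panorama (actions 210/222), build one
    compensation matrix Rc per column (action 214), and repaint every
    pixel in the column (actions 212-218) from the compensated ray."""
    n0 = np.asarray(n0, dtype=float)
    pano = np.zeros((pano_h, pano_w, 3), dtype=np.uint8)
    for col in range(pano_w):
        alpha = 2.0 * np.pi * col / pano_w       # column angle on the cylinder
        frac = theta0 * alpha / (2.0 * np.pi)    # spread fraction of the error
        # Rodrigues' formula: rotation of `frac` about unit axis n0
        K = np.array([[0.0, -n0[2], n0[1]],
                      [n0[2], 0.0, -n0[0]],
                      [-n0[1], n0[0], 0.0]])
        Rc = np.eye(3) + np.sin(frac) * K + (1.0 - np.cos(frac)) * (K @ K)
        for row in range(pano_h):
            ray = cylinder_ray(col, row, pano_w, pano_h)
            src_ray = Rc.T @ ray                 # ray that Rc maps onto `ray`
            pano[row, col] = sample_source(src_ray)
    return pano

def cylinder_ray(col, row, pano_w, pano_h, radius=1.0):
    """Ray from the optical center through a cylinder pixel (y-up axis,
    a simplifying assumption of this sketch)."""
    a = 2.0 * np.pi * col / pano_w
    y = row - pano_h / 2.0
    return np.array([radius * np.cos(a), y, radius * np.sin(a)])
```

Computing Rc once per column rather than once per pixel reflects the cost-saving reuse noted in action 214.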
One embodiment of method 200 implemented in pseudo code is provided in the attached index.
Various other adaptations and combinations of features of the embodiments disclosed are within the scope of the invention. Numerous embodiments are encompassed by the following claims.
Number | Date | Country
---|---|---
20040042685 A1 | Mar 2004 | US