This is a U.S. National Phase Application under 35 U.S.C. 371 of International Application PCT/JP2007/068468, filed on Sep. 22, 2007, which claims the priority of Japanese Application No. 2006-299208, filed Nov. 2, 2006; the entire contents of both Applications are hereby incorporated by reference.
The present invention relates to a wide-angle image acquiring method in which two cameras are used to photograph an object so that a first image and a second image are acquired, and a wide-angle image of the object is then obtained from the first image and the second image, and to a wide-angle stereo camera device.
In recent years, “stereo cameras” have come into wide use in various fields, such as the robotics industry, because they can measure three-dimensional space and thereby help to recognize or identify an object. A stereo camera is composed of two cameras (lenses) placed at two separated positions, which photograph the same object at the same time. The stereo camera can thus acquire information in the depth direction; that is, it can acquire an image from which the space can be perceived stereoscopically, as when a person views the object directly.
When the two cameras are positioned parallel to each other with respect to the horizontal line, such a stereo camera offers two advantages.
The first advantage is that, when a distance in the depth direction is to be measured, the fundamental principle of triangulation is used: if said distance is represented by Z, it can easily be acquired by Formula (10), shown below. However, in order to satisfy the relationship shown by Formula (10), the two cameras must be placed parallel to each other; if either camera is not placed in parallel, very complicated calculations become necessary.
Z=b×f/d (10)
In Formula (10), “b” represents the distance (baseline) between the two cameras, “d” represents the positional disagreement (disparity) on the imaging surfaces acquired by the two cameras with respect to the same object, and “f” represents the focal length.
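For illustration only, a minimal sketch of Formula (10) in Python is shown below; the baseline, focal length, and disparity values are assumed, not taken from the patent.

```python
# Minimal sketch of Formula (10): depth from disparity for a parallel
# stereo pair. All numeric values are assumed for illustration.

def depth_from_disparity(b: float, f: float, d: float) -> float:
    """Z = b * f / d; valid only when the two cameras are parallel."""
    if d == 0:
        raise ValueError("zero disparity: the point is at infinity")
    return b * f / d

# Assumed example: baseline b = 0.12 m, focal length f = 800 px,
# disparity d = 16 px  ->  Z = 6.0 m
print(depth_from_disparity(b=0.12, f=800.0, d=16.0))
```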
The second advantage concerns the two separate images photographed by the two cameras: any point on the image photographed by one camera can easily be matched to the corresponding point on the image photographed by the other camera. That is, in a stereo camera, an observing point on the image photographed by one camera must be searched for on the image photographed by the other camera, as the corresponding observing point, and the line along which this search is conducted is termed an “Epipolar line”. If the two cameras are arranged parallel to each other, the Epipolar lines of the two images coincide in a parallel relationship, so that a point corresponding to the observing point on the image photographed by one camera is easily found on the image photographed by the other camera.
However, from a physical point of view, it is very difficult to arrange two cameras to be truly parallel, and so, in recent years, the images photographed by the two cameras have been processed so that this parallel relationship (often called rectification) is established (for example, see Patent Documents 1 and 2).
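The text does not specify how this paralleling processing is performed; as a hedged sketch only, one common approach uses OpenCV's stereo rectification, driven by previously calibrated camera parameters. Every numeric value below is an assumption for illustration.

```python
# Hedged sketch of the "paralleling" (rectification) step with OpenCV.
# K1, K2, R, T, and the image size are assumed placeholder calibration
# data; real values come from a prior stereo calibration.
import numpy as np
import cv2

K1 = K2 = np.array([[800.0, 0.0, 320.0],
                    [0.0, 800.0, 240.0],
                    [0.0, 0.0, 1.0]])
dist1 = dist2 = np.zeros(5)        # assume negligible lens distortion
R = np.eye(3)                      # relative rotation (assumed)
T = np.array([0.12, 0.0, 0.0])     # 12 cm baseline (assumed)
size = (640, 480)

Rrect1, Rrect2, Prect1, Prect2, Q, _, _ = cv2.stereoRectify(
    K1, dist1, K2, dist2, size, R, T)
m1x, m1y = cv2.initUndistortRectifyMap(K1, dist1, Rrect1, Prect1, size,
                                       cv2.CV_32FC1)
m2x, m2y = cv2.initUndistortRectifyMap(K2, dist2, Rrect2, Prect2, size,
                                       cv2.CV_32FC1)
# rect1 = cv2.remap(img1, m1x, m1y, cv2.INTER_LINEAR)
# rect2 = cv2.remap(img2, m2x, m2y, cv2.INTER_LINEAR)
```

After applying these maps, the Epipolar lines of the two images run along the same rows, which is exactly the parallel relationship described above.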
With the above-described stereo camera composed of two separate cameras, each camera acts as a pin-hole camera, photographing the object as an image on a flat surface. Accordingly, if a stereo camera is composed of cameras each having a wide-angle lens, such as a fish-eye lens, such cameras can acquire a wide-angle image from which the space can be perceived three-dimensionally, but the above-described first and second advantages are not sufficiently realized.
That is, a wide-angle lens, such as the fish-eye lens, curves the image of an object plane very greatly and exhibits a field angle (represented by θ) of 180°, whereby, as shown in
An object of the present invention is to provide a wide-angle image acquiring method and a wide-angle stereo camera device which can acquire a wide-angle image, from which the three-dimensional space can be perceived, while maintaining the above-described first and second advantages.
The above problems can be solved by the structures listed below.
Structure 1. In a wide-angle image acquiring method, which photographs an object while using a first camera and a second camera to acquire a first image and a second image, and which acquires a wide-angle image of the object while using the first image and the second image, said wide-angle image acquiring method is characterized by comprising:
a step of paralleling an Epipolar line of the first image with an Epipolar line of the second image, so that said Epipolar lines are made consistent with each other;
a step of correcting distortions of the first image and the second image, each projected on a cylindrical imaging surface;
a step of searching for a point which corresponds to an observing point on the first image, on the Epipolar line of the second image; and
a step of reconstructing a three-dimensional figure of the object, after acquiring distances in a depth direction from the first and the second cameras to the object, based on a searched result of the searching step.
Structure 2. The wide-angle image acquiring method of Structure 1 is characterized in that, in the step of reconstructing the three-dimensional figure of the object, when the distance between the first camera and the object is represented by R1 in a cylindrical coordinate system, said distance R1 is calculated by the formula shown below.
R1=b×f/d′
(In which formula, “b” represents the distance between the first camera and the second camera, “f” represents the focal length of the first camera and the second camera, and “d′” represents the positional disagreement between the first image and the second image.)
Structure 3. The wide-angle image acquiring method of Structure 1 is characterized in that, in the step of reconstructing the three-dimensional figure of the object, when the distance between the first camera and the object is represented by Z2 in an orthogonal coordinate system, said distance Z2 is calculated by the formula shown below.
Z2=R1×sin θ1
(In which formula, “θ1” represents the angle viewed from the first camera.)
Structure 4. In a wide-angle stereo camera device, which photographs an object while using a first camera and a second camera to acquire a first image and a second image, and which acquires a wide-angle image of the object while using the first image and the second image, the wide-angle stereo camera device is characterized by comprising:
a means for paralleling an Epipolar line of the first image with an Epipolar line of the second image, so that said Epipolar lines are made consistent with each other;
a means for correcting distortions of the first image and the second image, each projected onto a cylindrical imaging surface;
a means for searching for a point which corresponds to an observing point on the first image, on the Epipolar line of the second image; and
a means for reconstructing a three-dimensional figure of the object, after acquiring distances in the depth direction from the first and the second cameras toward the object, based on a searched result of the searching means.
Structure 5. The wide-angle stereo camera device of Structure 4 is characterized in that, in the means for reconstructing the three-dimensional figure of the object, when the distance between the first camera and the object is represented by R1 in the cylindrical coordinate system, said distance R1 is calculated by the formula shown below.
R1=b×f/d′
(In which formula, “b” represents the distance between the first camera and the second camera, “f” represents the focal length of the first camera and the second camera, and “d′” represents the positional disagreement between the first image and the second image.)
Structure 6. The wide-angle stereo camera device of Structure 4 is characterized in that, in the means for reconstructing the three-dimensional figure of the object, when the distance between the first camera and the object is represented by Z2 in an orthogonal coordinate system, said distance Z2 is calculated by the formula shown below.
Z2=R1×sin θ1
(In which formula, “θ1” represents the angle viewed from the first camera.)
Based on the present invention, the Epipolar line of the first image is made consistent with the Epipolar line of the second image in the step of paralleling, whereby the first and second advantages detailed in the background description above can be maintained. Further, any distortions of the first image and the second image are corrected in the step of correcting, so that, even though the projection characteristic is transformed, the central portions of the first and second images are prevented from being excessively reduced, while the side portions are prevented from being excessively enlarged (see
The best embodiment for realizing the present invention will now be detailed with reference to the drawings. Various limitations which are technically preferable for carrying out the present invention are applied to the embodiments described below; however, the scope of the present invention is not limited to these embodiments nor to the relevant drawings.
Firstly, a wide-angle stereo camera of the present invention will now be detailed. In
First camera 10 and second camera 20, which have the same structure as each other, are arranged in parallel, or nearly in parallel, with respect to the horizontal line. First camera 10 has first lens 11, while second camera 20 has second lens 21. First and second lenses 11 and 21 are well-known wide-angle lenses, such as fish-eye lenses, whereby first and second cameras 10 and 20 can capture wide-angle images through said first and second lenses 11 and 21.
Image processing device 30 is composed of a general-purpose CPU (Central Processing Unit), a RAM (Random Access Memory), a ROM (Read Only Memory), and similar electronic devices. In image processing device 30, the CPU conducts calculation processes in the RAM, which serves as a working area, in accordance with processing programs stored in the ROM. In more detail, after receiving the images captured by first and second cameras 10 and 20, image processing device 30 conducts image processing on them, so as to generate a corrected image, or an image exhibiting information concerning the depth direction (see the later description).
[First Embodiment]
Subsequently, the first embodiment of the “wide-angle image acquiring method” in the present invention will now be detailed.
As shown in
Since first image 41 and second image 42 were captured through first and second lenses 11 and 21, each being a wide-angle lens exhibiting the fθ characteristic (in which the image height is proportional to the field angle θ), first image 41 and second image 42 are represented as images projected onto hemispherical image capturing surfaces 12 and 22 (see the dotted line in
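As a brief illustration of this fθ characteristic (with an assumed focal length), the sketch below contrasts it with the f·tan θ characteristic of an ordinary lens; note that f·tan θ diverges as θ approaches 90°, while f·θ stays finite, which is why a fish-eye lens can cover a 180° field angle.

```python
# Compare image heights under the two projection characteristics.
# The focal length and sample angles are assumed for illustration.
import math

f = 800.0  # focal length in pixels (assumed)

for deg in (10, 45, 80, 89):
    theta = math.radians(deg)
    r_tan = f * math.tan(theta)   # ordinary lens: r = f * tan(theta)
    r_ftheta = f * theta          # fish-eye lens: r = f * theta
    print(f"{deg:2d} deg  f*tan: {r_tan:10.1f}  f*theta: {r_ftheta:7.1f}")
```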
After the process of the paralleling step has been completed, image processing device 30, serving as the correcting means, transforms the coordinates of first and second images 41 and 42 to the coordinates of cylindrical image capturing surfaces 13 and 23, so that any distortions of first and second images 41 and 42 are corrected (see the solid line in
In more detail, when the coordinates of first and second images 41 and 42 before the correcting step are represented by [x, y], while their coordinates after the correcting step are represented by [x′, y′], coordinates [x, y] are transformed to coordinates [x′, y′] based on Formula (1), shown below. As a result, first and second images 41 and 42, having been projected onto hemispherical surfaces 12 and 22, can be projected onto cylindrical image capturing surfaces 13 and 23.
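Formula (1) itself does not survive in this text, so it cannot be reproduced here. Purely as a hedged stand-in, assuming an fθ (equidistant) fish-eye and a cylinder of radius f whose axis is parallel to the baseline, the inverse map that an image-warping loop would use (from a cylindrical output pixel [x′, y′] back to a fish-eye source pixel [x, y]) could be written as follows; the patent's actual Formula (1) may differ.

```python
# Hypothetical stand-in for Formula (1), written as the inverse map
# used for warping: cylindrical pixel -> fisheye pixel.
import numpy as np

def cylinder_to_fisheye(xp, yp, f):
    phi = yp / f                           # angle around the cylinder axis
    X, Y, Z = xp / f, np.sin(phi), np.cos(phi)       # 3-D ray direction
    theta = np.arccos(Z / np.sqrt(X*X + Y*Y + Z*Z))  # angle from optical axis
    r = f * theta                          # f-theta (equidistant) lens
    denom = np.hypot(X, Y) + 1e-12         # avoid 0/0 at the image center
    return r * X / denom, r * Y / denom    # fisheye coordinates (x, y)
```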
After the correcting step has been completed, in order to acquire the corresponding relationship between corrected first image 41 and corrected second image 42, image processing device 30, serving as the searching means, searches, on Epipolar line EP of corrected second image 42, for the point which corresponds to the observing point on Epipolar line EP of corrected first image 41 (which is the searching step). As a result, the amount of positional disagreement between corrected first image 41 and corrected second image 42, which is the value corresponding to “d” shown in Formula (10), can be acquired.
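The text does not name a particular matching method; a minimal sketch of such a search, using a sum-of-absolute-differences window slid along one Epipolar line (row) of the corrected pair, might look like this. The window size and disparity range are assumed parameters.

```python
# Hedged sketch of the searching step: 1-D SAD block matching along a
# single Epipolar line (one image row) of the corrected pair.
import numpy as np

def match_along_epipolar(left_row, right_row, x, win=5, max_d=64):
    """Return the disparity d' that minimizes the SAD cost at column x."""
    half = win // 2
    ref = left_row[x - half: x + half + 1].astype(np.int32)
    best_d, best_cost = 0, np.inf
    for d in range(max_d + 1):
        if x - d - half < 0:
            break                          # candidate window off the image
        cand = right_row[x - d - half: x - d + half + 1].astype(np.int32)
        cost = np.abs(ref - cand).sum()
        if cost < best_cost:
            best_d, best_cost = d, cost
    return best_d
```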
After the process of the searching step has been completed, image processing device 30, serving as the reconstruction means, acquires the distance for each observing point on corrected first image 41, based on the processed results (the searched results) of the searching step, so that a three-dimensional figure of object 40 is reconstructed from these distances together with the image capturing result of first camera 10 (being the two-dimensional figure of first image 41) acquired in the image capturing step (which is the reconstruction step). As a result, a conclusive wide-angle image can be generated.
In more detail, the three-dimensional coordinates (which are the cylindrical coordinates) of object 40, viewed from first camera 10, are represented by [X1, θ1, R1] (see
X1=x′/f×R1 (2a)
θ1=y′/f (2b)
R1=b×f/d′ (2c)
In these formulas, “x′” and “y′” represent the corrected coordinate values, “f” represents the focal length of both first and second lenses 11 and 21 of first and second cameras 10 and 20, “b” represents the distance between first camera 10 and second camera 20, and “d′” represents the amount of positional disagreement between corrected first image 41 and corrected second image 42.
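Formulas (2a)-(2c) translate directly into a few lines of code; a minimal sketch, using the symbols defined above, is:

```python
# Sketch of Formulas (2a)-(2c): cylindrical coordinates [X1, theta1, R1]
# from corrected coordinates (x', y'), focal length f, baseline b, and
# the measured positional disagreement d'.
def cylindrical_coords(xp, yp, f, b, dp):
    R1 = b * f / dp        # (2c) radial distance by triangulation
    theta1 = yp / f        # (2b) angle in the cylindrical system
    X1 = xp / f * R1       # (2a) coordinate along the cylinder axis
    return X1, theta1, R1
```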
When said cylindrical coordinates [X1, θ1, R1] are to be expressed in the normally used orthogonal coordinates [X2, Y2, Z2] (see
X2=X1 (3a)
Y2=R1×cos θ1 (3b)
Z2=R1×sin θ1 (3c)
In addition, symbol “θ1” is the value acquired by Formula (2b), and represents the angle to object 40, viewed from first camera 10, in the cylindrical coordinate system.
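A matching sketch of Formulas (3a)-(3c), converting the cylindrical coordinates into the orthogonal coordinates [X2, Y2, Z2], is:

```python
# Sketch of Formulas (3a)-(3c): cylindrical [X1, theta1, R1] to
# orthogonal [X2, Y2, Z2].
import math

def cylindrical_to_orthogonal(X1, theta1, R1):
    X2 = X1                       # (3a) axial coordinate is unchanged
    Y2 = R1 * math.cos(theta1)    # (3b)
    Z2 = R1 * math.sin(theta1)    # (3c)
    return X2, Y2, Z2
```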
In the above-described first embodiment, since the paralleling step makes Epipolar lines EP of first and second images 41 and 42 consistent with each other, the first and second advantages can be maintained, as described in the previous background description. Further, since the correcting step corrects any distortions of first and second images 41 and 42, even though the projection characteristic is transformed, the central portions of first and second images 41 and 42 are prevented from being excessively reduced, while the side portions are prevented from being excessively enlarged (see
[Second Embodiment]
The second embodiment of “the wide-angle image acquiring method” of the present invention will now be detailed.
The second embodiment differs from the first embodiment concerning the points detailed below; in all other respects, it is the same as the first embodiment.
Wide-angle stereo camera device 1, relating to the second embodiment, employs well-known standard lenses as first and second lenses 11 and 21, instead of the wide-angle lenses.
In the wide-angle image acquiring method relating to the second embodiment, in the same way as in the first embodiment, the image capturing step, the paralleling step, the correcting step, the searching step and the reconstruction step are conducted; however, as shown in
In the correcting step, coordinates [x, y] of the not-yet-corrected first and second images 43 and 44 are transformed into coordinates [x″, y″], based on Formula (4), shown below. As a result, first and second images 43 and 44, having been projected onto flat image capturing surfaces 14 and 24, can be projected onto cylindrical image capturing surfaces 13 and 23 (see the solid lines in
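Like Formula (1), Formula (4) does not survive in this text. As a hedged stand-in only, assuming an ordinary f·tan θ (perspective) image and a cylinder of radius f whose axis is parallel to the baseline, the mapping and its inverse could be written as follows; the patent's actual Formula (4) may differ.

```python
# Hypothetical stand-in for Formula (4): perspective (flat) image
# coordinates to cylindrical coordinates, plus the inverse map that an
# image-warping loop would use.
import math

def flat_to_cylinder(x, y, f):
    """Perspective pixel (x, y) -> cylindrical pixel (x'', y'')."""
    phi = math.atan2(y, f)        # vertical angle around the cylinder axis
    return x * math.cos(phi), f * phi

def cylinder_to_flat(xpp, ypp, f):
    """Inverse map: cylindrical pixel (x'', y'') -> perspective pixel."""
    phi = ypp / f
    return xpp / math.cos(phi), f * math.tan(phi)
```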
In the reconstruction step, when the three-dimensional coordinates (being the cylindrical coordinates) of object 40, viewed from first camera 10, are represented as [X3, θ3, R3] (see
X3=x″/f×R3 (5a)
θ3=y″/f (5b)
R3=b×f/d″ (5c)
Symbols “x″” and “y″” represent the coordinate values after the correcting step, while “d″” represents the amount of positional disagreement between first image 43 and second image 44.
When said cylindrical coordinates [X3, θ3, R3] are to be represented by normally used orthogonal coordinates [X4, Y4, Z4] (see
X4=X3 (6a)
Y4=R3×cos θ3 (6b)
Z4=R3×sin θ3 (6c)
In addition, “θ3” shown in Formulas (6b) and (6c) is a value which is acquired by Formula (5b), and is the angle to object 40, viewed from first camera 10 in the cylindrical coordinate system.
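Combining Formulas (5a)-(5c) and (6a)-(6c) gives the following sketch for the second embodiment; here (5b) is read as θ3 = y″/f, in line with the statement above that θ3 is an angle.

```python
# Sketch of Formulas (5a)-(5c) and (6a)-(6c): from corrected coordinates
# (x'', y'') and positional disagreement d'' to cylindrical coordinates
# [X3, theta3, R3], then to orthogonal coordinates [X4, Y4, Z4].
import math

def reconstruct_point(xpp, ypp, f, b, dpp):
    R3 = b * f / dpp              # (5c)
    theta3 = ypp / f              # (5b), read as an angle (see text)
    X3 = xpp / f * R3             # (5a)
    X4 = X3                       # (6a)
    Y4 = R3 * math.cos(theta3)    # (6b)
    Z4 = R3 * math.sin(theta3)    # (6c)
    return X4, Y4, Z4
```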
In the above-described second embodiment, since the paralleling step makes Epipolar lines EP of first and second images 43 and 44 consistent with each other, the first and second advantages can be maintained, as described in the previous background description. Further, since the correcting step corrects any distortions of first and second images 43 and 44, even though the projection characteristic is transformed, the central portions of first and second images 43 and 44 are prevented from being excessively reduced, while the side portions are prevented from being excessively enlarged. Accordingly, the second embodiment can acquire a wide-angle image, in which the space can be perceived three-dimensionally, while the above-described first and second advantages are maintained.
In addition, the present invention is not limited to the first and second embodiments, and various improvements and design changes can be made as appropriate within the scope of this invention, as long as they do not deviate from the contents of the present invention.
As an example of such improvements and design changes: since optical systems, such as first and second lenses 11 and 21, are employed in first and second cameras 10 and 20, distortion is inevitably generated, so that a perfect fθ characteristic or a perfect f·tan θ characteristic cannot be realized. Accordingly, in the correcting step, a well-known distortion correction may be conducted in addition to the projection onto the cylindrical image capturing surface, and further, first and second images 41-44 may be enlarged or reduced by well-known methods.
Number | Date | Country | Kind |
---|---|---|---
2006-299208 | Nov 2006 | JP | national |
Filing Document | Filing Date | Country | Kind | 371c Date |
---|---|---|---|---
PCT/JP2007/068468 | 9/22/2007 | WO | 00 | 4/23/2009 |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---
WO2008/053649 | 5/8/2008 | WO | A |
Number | Name | Date | Kind |
---|---|---|---
5986668 | Szeliski et al. | Nov 1999 | A |
6011863 | Roy | Jan 2000 | A |
6175648 | Ayache et al. | Jan 2001 | B1 |
6608923 | Zhang et al. | Aug 2003 | B1 |
6671399 | Berestov | Dec 2003 | B1 |
6674892 | Melen | Jan 2004 | B1 |
6677981 | Mancuso et al. | Jan 2004 | B1 |
7103212 | Hager et al. | Sep 2006 | B2 |
7573491 | Hartkop et al. | Aug 2009 | B2 |
20030156751 | Lee et al. | Aug 2003 | A1 |
20050219693 | Hartkop et al. | Oct 2005 | A1 |
20060018509 | Miyoshi et al. | Jan 2006 | A1 |
20060029256 | Miyoshi et al. | Feb 2006 | A1 |
20060193509 | Criminisi et al. | Aug 2006 | A1 |
20080058593 | Gu et al. | Mar 2008 | A1 |
20090167843 | Izzat et al. | Jul 2009 | A1 |
20100329543 | Li et al. | Dec 2010 | A1 |
Number | Date | Country |
---|---|---
11-083530 | Mar 1999 | JP |
2000-121319 | Apr 2000 | JP |
2001141422 | May 2001 | JP |
2002-152776 | May 2002 | JP |
2002-359838 | Dec 2002 | JP |
2005-293038 | Oct 2005 | JP |
2007295028 | Nov 2007 | JP |
Number | Date | Country
---|---|---
20100091090 A1 | Apr 2010 | US |