This invention relates to a projection device for three-dimensional measurement, and to a three-dimensional measurement system. More specifically, this invention relates to a three-dimensional measurement system that can automatically measure a wide area using a projection device for projecting a target pattern for three-dimensional measurement and a photographed image including the projected pattern.
In conventional non-contact three-dimensional measurement, a relatively large-sized apparatus called a “non-contact three-dimensional measurement machine,” incorporating a light pattern projector and a CCD camera, is used to measure small areas, targets affixed to each small area are measured by a photogrammetric technique, and the small areas are integrated into a wide area based on the coordinate points of the targets.
In case where only images from a digital camera or the like are used for three-dimensional measurement, a stereo pair is set, orientation of two or more images is determined, and a measurement position is set manually or semi-automatically.
To measure a wide area, a large-sized non-contact three-dimensional measurement machine is used to measure a large number of small areas, and a photogrammetric technique is used to photograph, with a camera, the image-connecting targets affixed to each small area, to measure the target points three-dimensionally with high accuracy, and to integrate the camera coordinate system with the three-dimensional coordinate systems (such as global coordinate systems) of the targets in each area measured by the three-dimensional measurement machine, thereby measuring the entire wide area.
However, this technique is complicated, since separate measurement devices are required for the small areas and for the wide area, and the three-dimensional measurement cannot be automated from end to end. In particular, when a large number of small areas are integrated over an extended area with high accuracy, the reduced measurement range of each area results in a huge number of measurement areas, which in turn makes the work complicated and inefficient. For example, merely measuring a side surface of a car requires 100 or more small areas, or cuts. Thus, even if each operation is simple, the entire operation is inefficient, consuming time and effort.
The object of this invention is to improve the efficiency of and promote the automation of non-contact three-dimensional measurement over a wide area utilizing a projection device for projecting a target pattern.
In order to achieve the above object, a projection device for three-dimensional measurement 80 according to the invention comprises, as shown in
Here, the measurement points include orientation points, and the measurement patterns include orientation patterns. Three-dimensional measurement may be performed based on either absolute coordinates or relative coordinates. The term “displacement” means displacement from the measurement points which would be obtained if a surface of the measuring object were projected onto a plane perpendicular to the projection light. The phrase “the measurement points are changed” means changing the type (such as grid intersection, small circle, retro target and color-coded target), the position, the color, the dimension, etc. of the measurement points. The phrase “based on the displacement, the measurement points are increased, deleted or changed” typically means increasing the measurement points where the displacement of the measurement points is large. However, the phrase can also cover various operations, such as increasing the measurement points where a characteristic point such as a corner, a peak or a saddle point of a concavo-convex surface is found, moving a measurement point near a characteristic point to the characteristic point, and deleting an inaccurate point found as a result of orientation or stereo matching. The first measurement pattern may be formed into the second measurement pattern more than once, that is, as many times as necessary. Accordingly, the measurement pattern may be projected and detected as many times as necessary. The pattern projection control section, the pattern detection section and the pattern forming section are typically implemented in a computer, and may be constituted integrally with or separately from the projection section.
With this constitution, the measurement pattern can be optimized according to the shape, etc. of the measuring object, thereby improving the efficiency of orientation and three-dimensional measurement using the optimized measurement pattern. Also, the processes from projection of a measurement pattern to detection of it can be automated, thereby promoting the automation of orientation and three-dimensional measurement.
A projection device for three-dimensional measurement according to the invention comprises, as shown in
Here, the first measurement pattern may be changed into the third measurement pattern more than once, that is, as many times as necessary. Accordingly, the measurement pattern may be projected and detected as many times as necessary. The pattern selection section is typically implemented in a personal computer, and the pattern storage section may be implemented in a storage device disposed internally or externally to the personal computer. The pattern selection section and the pattern storage section may be constituted integrally with or separately from the projection section.
With this constitution, the measurement pattern can be optimized according to the shape, etc. of the measuring object, thereby improving the efficiency of orientation and three-dimensional measurement using the optimized measurement pattern. Also, the processes from projection of a measurement pattern to detection of it can be automated, thereby promoting the automation of orientation and three-dimensional measurement.
The projection device for three-dimensional measurement as recited above according to the invention may further comprise, as shown in
A three-dimensional measurement system 100 according to the invention may comprise, as shown in
A three-dimensional measurement system according to the invention may comprise, as shown in
A calculation processing section 49, according to the invention, of a projection device for three-dimensional measurement having a projection section for projecting a measurement pattern onto a measuring object and detecting predetermined data from a photographed image of the measurement pattern projected onto the measuring object, may comprise, as shown in
With this constitution, the measurement pattern can be optimized according to the shape, etc. of the measuring object, thereby improving the efficiency of orientation and three-dimensional measurement using the optimized measurement pattern. Also, the processes from projection of a measurement pattern to detection of it can be automated, thereby promoting the automation of orientation and three-dimensional measurement.
In order to achieve the above object, a projection device for three-dimensional measurement 80 according to the invention comprises, as shown in
Here, the measurement patterns include orientation patterns. Three-dimensional measurement may be performed based on either absolute coordinates or relative coordinates. The position detection pattern typically includes a retro target or a template pattern. However, the position detection pattern is not limited thereto, but may be a grid pattern or a dot pattern that allows identification of the position. The color code pattern typically includes a pattern having plural rectangular unit areas arranged adjacently. However, the color code pattern is not limited thereto, but may be a pattern having plural colored retro targets. The pattern may include a single unit area with different colors. The pattern projection control section, the pattern detection section and the pattern forming section are typically implemented in a personal computer, and may be constructed separately from the projection section.
With this constitution, identification of the respective color-coded marks facilitates, and also automates, searching a stereo image for corresponding points, connecting adjacent images, and setting a stereo matching area. This also improves the efficiency of and promotes the automation of orientation and three-dimensional measurement.
A projection device for three-dimensional measurement according to the invention comprises, as shown in
Here, the pattern selection section may typically be implemented in a personal computer, and the pattern storage section may be implemented in a storage device disposed internally or externally to the personal computer. The pattern selection section and the pattern storage section may be constructed separately from the projection section. With this constitution, identification of the respective color-coded marks facilitates, and also automates, searching a stereo image for corresponding points, connecting adjacent images, and setting a stereo matching area. This also improves the efficiency of and promotes the automation of orientation and three-dimensional measurement.
The projection device for three-dimensional measurement as recited above according to the invention may further comprise, as shown in
A calculation processing section 49, according to the invention, of a projection device for three-dimensional measurement having a projection section for projecting a measurement pattern onto a measuring object and detecting predetermined data from a photographed image of the measurement pattern projected onto the measuring object comprises, as shown in
With this constitution, identification of the respective color-coded marks facilitates, and also automates, searching a stereo image for corresponding points, connecting adjacent images, and setting a stereo matching area. This also improves the efficiency of and promotes the automation of orientation and three-dimensional measurement.
The invention can improve the efficiency of and promote the automation of non-contact three-dimensional measurement over a wide area utilizing a projection device for projecting a target pattern.
FIGS. 18A1, 18A2, 18B1 and 18B2 (
The basic Japanese Patent Applications No. 2005-289332 filed on Sep. 30, 2005 and No. 2005-289333 filed on Sep. 30, 2005 are hereby incorporated in their entirety by reference into the present application.
This invention will become more fully understood from the detailed description given hereinbelow. Other applicable fields will become apparent with reference to the detailed description given hereinbelow. However, the detailed description and the specific embodiments illustrate desired embodiments of this invention and are described only for the purpose of explanation. Various changes and modifications will be apparent to those of ordinary skill in the art on the basis of the detailed description.
The applicant has no intention to dedicate any disclosed embodiments to the public. Among the disclosed changes and modifications, those which may not literally fall within the scope of the present claims therefore constitute a part of this invention in the sense of the doctrine of equivalents.
While the invention will be described in connection with certain preferred embodiments, there is no intent to limit it to those embodiments. On the contrary, the intent is to cover all alternatives, modifications and equivalents as included within the spirit and scope of the invention as defined by the appended claims.
A first embodiment of this invention is described hereinafter with reference to the drawings. This embodiment represents an example in which projection of a measurement pattern (including an orientation pattern) in preparation for measurement is utilized for reconstructing a measurement pattern for use in orientation or three-dimensional measurement, and also represents an example in which a color-coded target is used as a target (mark) to construct a measurement pattern.
The projection patterns include various patterns, such as a measurement pattern, an orientation pattern, a random pattern, a measurement preparation pattern, an overlap photographing range indication pattern and a texture light pattern. The measurement pattern P indicates measurement points Q (such as a position detection pattern) for use in three-dimensional measurement. The measurement points Q projected on a measuring object are used as measurement points of a three-dimensional shape. The orientation pattern indicates orientation points for use in orientation. The orientation points projected on the measuring object are photographed in stereo and used in orientation. There is no clear distinction between the measurement pattern and the orientation pattern, except that the former generally has more measurement points than the latter has orientation points. Generally, a pattern for use in three-dimensional measurement is called a measurement pattern, while a pattern for use in orientation is called an orientation pattern.
The random pattern is a type of measurement pattern with measurement points arranged at random. The measurement preparation pattern is used in a measurement preparatory to orientation or three-dimensional measurement. A grid pattern or a pattern with many small circles arranged in an array such as shown in
The overlap photographing range indication pattern indicates the overlapping range of a stereo image. Assuming left and right images of a stereo image with color-coded targets CT at the four corners such as shown in
The projection section 12 projects various patterns onto the measuring object 1. The photographing section 10 obtains a photographed image (which is typically a stereo image, but may also be a pair of single photographic images) of the measuring object 1. The photographing section 10 may include, for example, a measurement-purpose stereo camera or a general-purpose digital camera, and a device for compensating for lens aberrations in an image of the measuring object 1 photographed by such a camera. The photographed image data storage section 13 stores a photographed image of the measuring object 1. The photographed image data storage section 13 stores, for example, a stereo image of the measuring object 1 photographed by the photographing section 10.
The correlating section 40 correlates a pair of photographed images or model images of the measuring object 1 to determine orientation or perform stereo matching. In case of using a stereo image of the measuring object 1, an orientation process is performed after a color-coded mark is extracted, a reference point is set, and a corresponding point is searched for. The correlating section 40 also performs stereo matching for three-dimensional measurement. The correlating section 40 includes an extraction section 41, a reference point setting section 42, a corresponding point search section 43, an orientation section 44, a corresponding point designating section 45, an identification code discrimination section 46, a pattern information storage section 47, a photographed/model image display section 48, a model image forming section 48A, a model image storage section 48B, and the calculation processing section 49. The extraction section 41, the identification code discrimination section 46, and the pattern information storage section 47 function also as the pattern detection section 491 of the calculation processing section 49. A matching processing section 70 plays an important role in stereo matching. The matching processing section 70 includes the reference point setting section 42, the corresponding point search section 43, and the corresponding point designating section 45.
The reference point setting section 42 searches the vicinity of a designated point on one image (reference image) of a stereo image for a point corresponding to a characteristic point, and sets the point corresponding to the characteristic point as a reference point. The characteristic point may be, for example, the center, the center of gravity or a corner of the measuring object 1, a mark (target) affixed to or projected on the measuring object 1, etc. The corresponding point search section 43 determines a corresponding point that corresponds to the reference point set by the reference point setting section 42 and that is on the other image (search image) of the stereo image. When an operator designates a point in the vicinity of a characteristic point with the corresponding point designating section 45, the characteristic point intended by the operator can be snapped to by means of the reference point setting section 42 without the operator exactly designating the characteristic point, and a corresponding point in the search image can be determined by the corresponding point search section 43.
The orientation section 44 determines the relationship between corresponding points in a pair of images, such as a stereo image, using the reference point set by the reference point setting section 42 and the corresponding point determined by the corresponding point search section 43, and performs an orientation calculation process. The corresponding point designating section 45 determines a corresponding point on the search image in case where the operator designates a point outside the vicinity of a characteristic point on the reference image. The operator can easily recognize the correlation between characteristic points of the measuring object 1 by contrasting the positions on the display device 60 of the designated point on the reference image and of the corresponding point on the search image determined by the corresponding point designating section 45. The orientation section 44 also determines relative orientation using the positional correspondence determined by the corresponding point designating section 45.
The calculation processing section 49 receives image data from the photographing section 10 and detects various patterns therefrom, and also generates various patterns to be projected from the projection section 12. The pattern detection section 491 detects the various patterns. The functions of the extraction section 41 and the identification code discrimination section 46 of the pattern detection section 491 will be described later with reference to
The model image forming section 48A forms a model image based on the parameters (the position and the tilt of the camera used in the photographing) obtained through the orientation calculation process by the orientation section 44. The model image, also called a rectified image, refers to a pair of left and right photographed images (stereo image) with their corresponding points rearranged on an identical epipolar line EP (see
The display image forming section 50 creates and displays a stereoscopic two-dimensional image of the measuring object 1 viewed from an arbitrary direction based on the three-dimensional coordinate data on the measuring object 1 and the photographed image or the model image of the measuring object 1. A three-dimensional coordinate data calculation section 51 calculates coordinates of three-dimensional positions of the measuring object 1, and a three-dimensional coordinate data storage section 53 stores the calculation results. A stereoscopic two-dimensional image forming section 54 forms a stereoscopic two-dimensional image based on the obtained three-dimensional coordinate data, and a stereoscopic two-dimensional image storage section 55 stores the resulting image. A stereoscopic two-dimensional image display section 57 displays on the display device 60 a stereoscopic two-dimensional image viewed from an arbitrary direction based on the information stored in the stereoscopic two-dimensional image storage section 55.
The retro target part P1 is used for detecting the target itself, the center of gravity thereof, the orientation (tilt) of the target, and the target area.
The reference color part P2 is used as a reference for relative comparison to deal with color deviation due to photographing conditions such as lighting and camera characteristics, or used for color calibration to compensate for such color deviation. In addition, the reference color part P2 can also be used for color correction of a color-coded target CT created in a simple way. For example, in case of using a color-coded target CT printed by a color printer (inkjet, laser or dye-sublimation printer, etc.) that is not color managed, individual variations in color occur depending on the printer used. However, the influence of such individual variations can be suppressed by relatively comparing the reference color part P2 and the color code part P3 and correcting their colors.
The color code part P3 expresses a code using a combination of colors distributed to respective unit areas. The number of codes that can be expressed changes with the number of code colors that can be used for codes. For example, in case where the number of code colors is “n”, the color-coded target CT1 of
The white part P4 is used for the detection of the direction of the color-coded target CT and for calibration of color deviation. Of the four corners of the color-coded target CT, only one corner does not have a retro target, and that corner can be used for the detection of the direction of the color-coded target CT. That corner, or the white part P4, only needs to have a pattern different from the retro target. Thus, the white part may have printed therein a character string such as a number to allow visual confirmation of a code, or may be used as a code area containing a barcode, etc. The white part may also be used as a template pattern for template matching to further increase detection accuracy.
The search processing section 110 detects a position detection pattern P1 such as retro target pattern from a color image (photographed image or model image) read from the photographed image data storage section 13 or the model image storage section 48B. In case where a template pattern instead of a retro target pattern is used as the position detection target, the template pattern is detected.
The retro target grouping processing section 120 groups those retro targets detected by the search processing section 110 and determined as belonging to the same color-coded target CT (for example, those with coordinates falling within the color-coded target CT) as candidates for retro targets belonging to the same group.
The color-coded target detection processing section 130 includes a color-coded target area/direction detection processing section 131 for detecting the area and the direction of a color-coded target CT based on a group of retro targets determined as belonging to the same color-coded target, a color detection processing section 311 for detecting the color arrangement in the reference color part P2 and the color code part P3 of a color-coded target CT and detecting the color of the measuring object 1 in an image, a color correction section 312 for correcting the color of the color code part P3 and the measuring object 1 in an image with reference to the reference color pattern P2, and a verification processing section 313 for verifying whether or not the grouping has been performed properly. The color correction section 312 corrects the color in the extracted photographed image, while the color modification section 494 modifies the color in the formed or selected projection pattern.
The image/color pattern storage section 140 includes a read image storage section 141 for storing an image (photographed image or model image) read by the extraction section 41, and a color-coded target correlation table 142 for storing a type-specific code number indicating the type of color-coded target CT for plural types of color-coded target CT expected to be used and for storing information on correlation between the pattern arrangement and the code number for each type of color-coded target CT.
The identification code discrimination section 46 discriminates an identification code based on the color arrangement in the color code part P3 for conversion into an identification code. The identification code discrimination section 46 includes a coordinate transformation processing section 321 for transforming the coordinate of a color-coded target CT based on the area and the direction of the color-coded target CT detected by the color-coded target detection processing section 130, and a code conversion processing section 322 for discriminating an identification code based on the color arrangement in the color code part P3 of the coordinate-transformed color-coded target CT for conversion into an identification code.
First, a color-coded target is affixed to the measuring object 1 (S01). The color-coded target may be provided by projection, in addition to or instead of affixation. The positions where the color-coded targets are affixed will be used as measurement points Q in orientation or three-dimensional measurement. Then, an image (typically, a stereo image) of the measuring object 1 is photographed using the photographing section 10 such as a digital camera (S10), and the photographed image is registered in the photographed image data storage section 13 (S11).
Then, returning to
Next, the process proceeds to the setting of a stereo pair. Of the images registered in the photographed image data storage section 13, a pair of left and right images are set as a stereo pair (S16) by utilizing the identification codes.
Next, the reference point setting section 42 searches for a point appropriate as a characteristic point in the vicinity of a point designated on one image (reference image) of a stereo image, and sets the point appropriate as the characteristic point as a reference point (S18). The corresponding point search section 43 determines a point corresponding to the reference point on the other image (search image) of the stereo image (S19).
Next, the orientation section 44 determines orientation (S30). The orientation section 44 determines the relative orientation of the stereo image of the measuring object 1 stored in the photographed image data storage section 13 to establish the relationship between corresponding points of the stereo image with respect to the model image.
Here, the operator designates a point on a reference image with a mouse cursor or the like, and the reference point setting section 42 and the corresponding point search section 43 read the coordinates of a reference point appropriate as a characteristic point and those of a point corresponding to the designated point, to obtain corresponding points (identical points) on two or more images. Six or more corresponding points are normally required for each image. If three-dimensional coordinate data on the measuring object 1 separately measured by a three-dimensional position measurement device (not shown) are stored beforehand in the three-dimensional coordinate data storage section 53, the reference point coordinates and the images are correlated to determine absolute orientation. If not stored, relative orientation is determined.
For example, if an overlapping stereo image includes four color-coded targets CT each including three position detection patterns (retro targets), an orientation process can be performed based on the coordinates of the centers of gravity of the total of twelve position detection patterns (retro targets). Since orientation can be determined with a minimum of six points, each color-coded target may include as few as two position detection patterns. In that case, orientation is determined using eight points. The orientation process can be performed automatically, manually or semi-automatically. In the semi-automatic orientation process, clicking the vicinity of a position detection pattern P1 in a color-coded target CT with a mouse triggers automatic position detection.
Then, for each image selected as a stereo pair, the orientation section 44 performs an orientation calculation process using the coordinates of the corresponding points. The position and the tilt of the left and right cameras that photographed the images, the positions of the corresponding points, and the measurement accuracy can be obtained in the orientation calculation process. In the orientation calculation process, relative orientation is determined to correlate a pair of photographed images or a pair of model images, while bundle adjustment is performed to determine orientation between plural or all images.
The model image forming section 48A forms a pair of model images based on the parameters determined by the orientation section 44 (S42), and the model image storage section 48B stores the model images formed by the model image forming section 48A (S43). The photographed/model image display section 48 displays these model images as a stereo image on the display device 60 (S44).
The model image, also called a rectified image, refers to a pair of left and right photographed images with their corresponding points rearranged on an identical epipolar line EP so as to be viewed stereoscopically. A rectified image (model image) is created by a rectification process. The rectified image is an image in which the epipolar lines EP of the left and right images are horizontally aligned with each other. Thus, as shown in
Then, a search is made for targets to be reference points RF on the same epipolar line EP (S140). In case of a rectified image, one-dimensional search on a single line is sufficient and hence the search is easy. In other cases, the search is made not only on the epipolar line but also on several lines around the epipolar line. If a reference point RF is found on an identical line as shown in
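As a minimal sketch of this one-dimensional search (assuming a rectified grayscale stereo pair given as NumPy arrays and a sum-of-squared-differences similarity; the function name is hypothetical, not from the specification):

```python
import numpy as np

def search_reference_point(left, right, x, y, half=5):
    """One-dimensional search along the epipolar line of a rectified pair:
    the matching point lies on the same row in both images."""
    template = left[y - half:y + half + 1, x - half:x + half + 1].astype(float)
    strip = right[y - half:y + half + 1, :].astype(float)
    best_x, best_ssd = None, np.inf
    for cx in range(half, right.shape[1] - half):
        window = strip[:, cx - half:cx + half + 1]
        ssd = ((window - template) ** 2).sum()  # sum of squared differences
        if ssd < best_ssd:
            best_ssd, best_x = ssd, cx
    return best_x
```

For a non-rectified image, the same loop would simply be repeated over the few rows around the epipolar line, as described above.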
Next, turning to
Then, a description is made of the automatic determination of a stereo matching area. The corresponding point search section 43 automatically sets a matching range so as to include the color-coded targets CT located at the four corners of a stereo image as shown in
By determining matching areas in this way, overlap between model images can be secured as shown in
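A minimal sketch of this automatic determination, assuming the centers of the four corner color-coded targets CT have already been detected in image coordinates (the function name is hypothetical):

```python
import numpy as np

def matching_area(corner_target_centers):
    """Axis-aligned matching range that just encloses the color-coded
    targets CT detected at the four corners of a stereo image."""
    pts = np.asarray(corner_target_centers, dtype=float)
    xmin, ymin = pts.min(axis=0)
    xmax, ymax = pts.max(axis=0)
    return xmin, ymin, xmax, ymax
```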
In case of fully automatic processing, with a large number of codes identified, photographing can be performed in an arbitrary order per pair of images (typically stereo image) as a base unit while securing overlap between adjacent images. With a fixed photographing order, automation is possible even with a small number of codes identified. In this case, only color-coded targets CT included in two (overlapping) images photographed in stereo need to be identified. A three-dimensional measurement (stereo measurement) is performed (S50) on an area where the matching area is determined (S45). For three-dimensional measurement, an image correlation process using a cross-correlation factor method is used, for example. The image correlation process is performed using the functions of the correlating section 40 (the extraction section 41, the reference point setting section 42, the corresponding point search section 43, etc.) and through calculation processing by the three-dimensional coordinate data calculation section 51.
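As a minimal sketch of the cross-correlation factor itself, computed between a template cut from one image and an equally sized window of the other (normalized cross-correlation is used here as one common form of such a factor; the function name is an assumption):

```python
import numpy as np

def cross_correlation_factor(template, window):
    """Normalized cross-correlation between a template from one image and
    an equally sized window of the other image; 1.0 is a perfect match."""
    t = template.astype(float) - template.mean()
    w = window.astype(float) - window.mean()
    denom = np.sqrt((t * t).sum() * (w * w).sum())
    return float((t * w).sum() / denom) if denom > 0 else 0.0
```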
The three-dimensional coordinates of the measuring object 1 are obtained through calculation processing by the three-dimensional coordinate data calculation section 51, and are stored in the three-dimensional coordinate data storage section 53. The stereoscopic two-dimensional image forming section 54 creates a stereoscopic two-dimensional image of the measuring object 1 based on the three-dimensional coordinates obtained by the three-dimensional coordinate data calculation section 51 or read from the three-dimensional coordinate data storage section 53, and the stereoscopic two-dimensional image storage section 55 stores the stereoscopic two-dimensional image. The stereoscopic two-dimensional image display section 57 displays on the display device 60 a stereoscopic two-dimensional image viewed from an arbitrary direction based on the information stored in the stereoscopic two-dimensional image storage section 55.
Such a stereoscopic two-dimensional image of the measuring object 1 on the screen can show a perspective view thereof as viewed from an arbitrary direction, and also a wire-framed or texture-mapped image thereof. Texture-mapping refers to affixing texture that produces a stereoscopic effect to a two-dimensional image of the measuring object 1.
Automatic measurement can be performed in this way through photographing (S10) to three-dimensional measurement (S50), to obtain the three-dimensional coordinates of the measuring object 1 and display a stereoscopic image on the display device 60.
In this embodiment, the projection section (projector) 12 is utilized in the basic process flow described above to allow the following processes:
(a) The projector 12 lights up the range to be photographed by the camera, and the stereo camera 10 is adjusted to photograph the range.
Color-coded targets CT may be arranged at the four corners of a projection pattern, to indicate the photographing range (overlap photographing range) and to allow connection of adjacent images.
(b) The projector 12 projects texture light (only light), and the camera 10 photographs a stereo image pair as an image for texture of one model image (image of the measuring object).
(c) For preparation before measurement, the projector 12 projects a measurement preparation pattern, which is photographed in stereo. A grid pattern or a pattern with a large number of small circles arranged in an array such as shown in
Reference points RF may be affixed to the displaced points extracted from the preparation before measurement. Alternatively, other action may be taken such as increasing the number of measurement points Q (including orientation points). The size, number and arrangement of the orientation points can be calculated to reflect the calculation results in the actual pattern projection.
In the preparation before measurement, the check for displaced points may be performed along with approximate measurement. That is, a photographed image is sent via the pattern detection section 491 to the orientation section 44 to calculate orientation. When the number of measurement points for the preparation before measurement is large enough, the projected orientation points may be used as measurement points to complete the measurement process.
(d) In the orientation process, the projector 12 projects color-coded targets CT and reference points RF. Here, color-coded targets CT are affixed to irradiated positions. If already affixed in the preparation before measurement, color-coded targets CT are affixed to other points. The affixation is not necessary if the measurement is performed using the projected pattern. In such a case, the projected pattern is photographed in stereo and utilized again in the orientation process.
(e) In the three-dimensional measurement, a pattern for measurement is projected by the projector 12. In this case, a random pattern is irradiated for stereo matching, for example. Since the required accuracy for a pattern for measurement is calculated beforehand based on the camera condition, a pattern for measurement with the size satisfying the accuracy is irradiated. The irradiated pattern for measurement is photographed in stereo, and utilized in three-dimensional measurement.
(f) When moving on to a next photographing position, the projector 12 may approximately navigate to the next photographing position.
The above processes can be fully automated. In that case, the affixing work is not performed, but the preparation before measurement is performed, the orientation is determined and the three-dimensional measurement is performed, using only the projection pattern from the projector.
First, the photographing condition is input (S200). The photographing section 10 includes an optical system with a variable focal length. The photographing condition may be the camera parameters of the photographing section 10, such as the number of pixels of the digital camera used, the approximate pixel size, the focal length, the photographing distance, the baseline length and the overlap ratio. When any one of the camera parameters is input, the in-plane resolution, the depth resolution, the angle of view, the measurement area, etc. can be calculated. That is, the projection section 12 can set the range of a pattern to be projected, according to the range photographed by the photographing section 10. This allows adjustment of the arrangement and the density of measurement points in the preliminary measurement pattern. In addition, when the side lap ratio, the size of the area desired to be measured, etc. are input, the number of images to be photographed can be calculated. When the camera parameters, the required accuracy (pixel resolution), etc. are input, the photographing distance, the baseline length, etc. can be calculated.
Then, the camera parameters are calculated based on the input condition (S210). At this time, based on the condition, the in-plane pixel resolution, the depth resolution, the size of the measurement range in a stereo image pair, the number of images required to obtain an image of the entire measuring object, etc. are calculated.
The in-plane resolution and the depth resolution can be calculated by the following equations (where the asterisk “*” represents a multiplication operator):
Δxy(in-plane resolution)=δp(pixel size)*H(photographing distance)/f(focal length)
Δz(depth resolution)=δp*H*H/(f*B(baseline: inter-camera distance))
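For illustration, these relations can be evaluated directly; a minimal sketch with assumed example values (a 7 µm pixel, 50 mm focal length, 2 m photographing distance and 0.6 m baseline; none of these values come from the specification):

```python
def in_plane_resolution(pixel_size, distance, focal_length):
    """Delta-xy = pixel size * photographing distance / focal length."""
    return pixel_size * distance / focal_length

def depth_resolution(pixel_size, distance, focal_length, baseline):
    """Delta-z = pixel size * distance^2 / (focal length * baseline)."""
    return pixel_size * distance ** 2 / (focal_length * baseline)

# assumed example values, all in metres
dp, H, f, B = 7e-6, 2.0, 0.05, 0.6
print(in_plane_resolution(dp, H, f))   # 2.8e-4 m, i.e. 0.28 mm per pixel
print(depth_resolution(dp, H, f, B))   # about 9.3e-4 m, i.e. 0.93 mm
```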
Then, the position and the projecting condition of the projector 12 are set to be consistent with the calculation results of the camera parameters of the photographing section 10 (S220).
Then, the projector 12 is switched to a photographing range indication mode, to project the range to be photographed by the camera 10 (S230). For the projection, light may be cast only onto the range to be photographed by the left and right stereo cameras 10. In this case, the range to be lighted up or indicated is automatically calculated from the condition input beforehand and the angle of view of the projector 12 used. The effective range where orientation can be determined and three-dimensional measurement can be performed is determined based on the overlapping range between the left and right photographing ranges. The overlap photographing range indication pattern indicates the overlapping range (overlapping part) between stereo images, and is formed as follows. The pattern projection control section 493 projects four color-coded targets CT, which are set to be arranged at the four corners of the overlapping range as shown in
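As a minimal sketch of how the effective (overlapping) range might be derived, assuming a simplified parallel stereo arrangement (the geometry and values are assumptions, not the specification's method):

```python
def overlap_range(sensor_width, focal_length, distance, baseline):
    """Footprint of one camera on the object plane and the left-right
    overlap of a parallel stereo pair (both in object-space units)."""
    footprint = sensor_width * distance / focal_length
    overlap = footprint - baseline  # parallel optical axes assumed
    return footprint, max(overlap, 0.0)

# assumed example: 24 mm sensor, 50 mm lens, 2 m distance, 0.6 m baseline
print(overlap_range(0.024, 0.05, 2.0, 0.6))  # (0.96 m, 0.36 m)
```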
Then, the camera position is set such that the projected range is photographed over approximately the entire screen (S240). At this time, the camera position is set such that the four color-coded targets CT in the overlap photographing range indication pattern are securely included in the left and right stereo photographing screens. Since the approximate camera position is already known from the condition input beforehand, such camera condition may be projected onto the measuring object for checking purposes.
Then, with the projection mode switched to the texture lighting mode, the projector 12 projects texture light (S245). The texture light does not have a pattern of shapes, but is uniform light cast onto the object. The texture light is also useful for considering to which parts of the measuring object 1 targets should be affixed. In case where only light is cast onto the photographing range in the projection process (S230), this work is not necessary.
Then, one model (a stereo pair; two images) is photographed as an image for texture (first photographing; S250). When a texture image is not necessary, this photographing can be omitted. It is also possible to modify the color in the color-coded target CT to be projected by the projection section 12, based on the color obtained from the photographed image of the pattern projected in the texture lighting mode. The modification is performed by the pattern forming section 492 utilizing the color modification section 494, for example. The pattern projection control section 493 causes the projection section 12 to project the modified measurement pattern. The above processes should be performed before the process flow of
Next, the preparation before measurement (preliminary measurement) (S255) is described. The reason for performing the preparation before measurement is to determine actual orientation and perform actual three-dimensional measurement efficiently. Thus, the preparation before measurement is not necessarily performed for some objects. Once the preparation before measurement is performed on an object, it is not necessary for similar objects.
Since the photographing condition is input in the photographing condition input process (S200), the value may be used to calculate the size, number and arrangement of orientation points, so that the pattern forming section 492 can form a measurement pattern and the pattern projection control section 493 can cause the projection section 12 to project the formed measurement pattern. In addition, reference points RF, color-coded targets CT or objects of a different shape may be attached at or substituted for the intersections of a grid, the centers of gravity of small circles, etc. to form a measurement preparation pattern.
Then, displaced points in the pattern are checked (S310). The check is performed visually or by calculation. Since the purpose here is to preliminarily estimate the rough shape, visual check is sufficient in most cases. In case of calculation, the pattern detection section 491 detects displacement of the intersections of a grid or the centers of gravity of small circles based on the photographed image from the stereo camera 10. The intersections of a grid or the centers of gravity of small circles are included in the measurement points. For example, the intersections of a grid and the centers of gravity of small circles that are not equally spaced may be detected as displaced points (points where displacement occurs). In case of a small circle pattern, a center of gravity detection algorithm is used to detect the centers of gravity for position measurement. In this way, assuming the measurement preparation pattern as a first measurement pattern P and the intersections of a grid or the centers of gravity of small circles as measurement points Q, the pattern detection section 491 can detect displacement of the measurement points in the first measurement pattern.
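A minimal sketch of the displacement check by calculation, assuming the detected grid intersections (or centers of gravity of small circles) of one grid row are available as an (n, 2) NumPy array and that equal spacing is expected; the threshold and function name are assumptions:

```python
import numpy as np

def find_displaced_points(row_points, threshold):
    """Flag measurement points that deviate from the midpoint of their two
    row neighbors, i.e. points that are no longer equally spaced."""
    displaced = []
    for i in range(1, len(row_points) - 1):
        expected = (row_points[i - 1] + row_points[i + 1]) / 2.0
        if np.linalg.norm(row_points[i] - expected) > threshold:
            displaced.append(i)  # index of a point where displacement occurs
    return displaced
```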
When displaced points (points where displacement occurs) are detected, reference points are affixed to the displaced points, or reference points are added to the measurement preparation pattern (S320). In case of a visual check, reference points may be affixed at the moment when displaced points are checked. In case of automated processing, reference points are added to the measurement preparation pattern according to the magnitude of displacement, that is, the magnitude of deformation. As a result of the preparation before measurement described above, displaced points in the pattern can be found beforehand, allowing targets to be affixed to the measuring object 1 as reference points. Also, a projection pattern to which reference points have been added can be created, or reference points in the vicinity of the displaced points can be increased. This allows effective orientation and three-dimensional measurement. To create a projection pattern with added reference points, the pattern forming section 492 forms a second measurement pattern with added measurement points based on the displacement of the measurement points in the first measurement pattern detected by the pattern detection section 491.
Then, turning to the flowchart of
Then, stereo photographing is performed (second photographing; S270). This process corresponds to S10 of
Then, with the projection mode switched to a random pattern mode, for example, a measurement pattern for three-dimensional measurement is projected (S280). This process corresponds to S01 of
Then, stereo photographing is performed (third photographing; S290). This process corresponds to S10 of
When measurement is performed at this position, the position is moved to a next photographing position (S298). That is, the process returns to S220 (in some cases, to S200) to repeat photographing until three-dimensional data on the entire measuring object can be obtained.
At this time, the projector may navigate to the next photographing position. The term “navigate” refers to, for example, selecting the number and arrangement of orientation points based on how the projected grid pattern is distorted and performing rough measurement, in order to consider the arrangement of orientation points or to search for a mismatch area and consider increasing the orientation points in that area. That is, the navigation results may determine the positions where the pattern is affixed or projected.
In forming the second measurement pattern, the measurement points may be reduced or changed. For example, the reference points may be changed to color-coded targets, the color code patterns of the color-coded targets may be changed, or measurement points in the vicinity of characteristic points may be moved to the characteristic points. The reference points may be reduced, or bad orientation points may be deleted, in order to return to the previous stage to perform measurement.
The processes of
Next, description is made of the color-coded target detection flow. Detection of color-coded targets is performed manually or automatically. When performed automatically, the process may be performed differently depending on the number of colors identified in the color-coded targets CT or the photographing method. First of all, description is made of the case where a large number of colors are identified in the color-coded targets CT. In this case, there is no restriction on the photographing order, allowing fully automatic processing.
First, color images to be processed (photographed images or model images) are read into the read image storage section 141 of the extraction section 41 (S500). Then, color-coded targets CT are extracted from each read image (S510).
Various search methods may be used such as (1) to search for a position detection pattern (retro target) P1 in a color-coded target CT, (2) to detect the chromatic dispersion of a color code part P3, (3) to use a colored position detection pattern, etc.
(1) In case where the color-coded target CT includes a retro target, that is, in case where a pattern with a sharp contrast in brightness is used, the retro target can be easily detected by photographing the object with the camera aperture stopped down and using a flash to obtain an image in which only the retro target is gleaming, and then binarizing the obtained image.
When the range where the target lies is determined, its center of gravity is calculated by, for example, the method of moments. For example, the retro target 200 shown in
xg={Σx*f(x,y)}/Σf(x,y) [Equation 1]
yg={Σy*f(x,y)}/Σf(x,y) [Equation 2]
where (xg, yg) represents the coordinates of the center of gravity, and f(x, y) represents the brightness value at coordinates (x, y).
In case where a retro target 200 shown in
In this way, the center of gravity of the retro target 200 can be found.
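A minimal sketch combining the binarization step with Equations 1 and 2 (the grayscale image is assumed to be a NumPy array, and the threshold value is an assumption):

```python
import numpy as np

def retro_target_center_of_gravity(img, threshold=200):
    """Binarize so that only the gleaming retro target remains, then apply
    the method of moments of Equations 1 and 2 to its brightness values."""
    f = np.where(img >= threshold, img, 0).astype(float)
    total = f.sum()
    if total == 0:
        return None  # no target brighter than the threshold
    ys, xs = np.indices(f.shape)
    xg = (xs * f).sum() / total  # Equation 1
    yg = (ys * f).sum() / total  # Equation 2
    return xg, yg
```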
(2) Normally, a color code part of a color-coded target CT uses a large number of code colors and has a large chromatic dispersion value. Thus, a color-coded target CT can be detected by finding a part with a large dispersion value from an image.
(3) The retro targets at the three corners of a color-coded target CT are given different colors so that the respective retro targets reflect different colors; the respective retro targets of the color-coded target can thereby be easily discriminated. In grouping retro targets, even when many retro targets are in use, the grouping process can be made easy by selecting the most closely located retro targets of different colors as candidates for retro targets of a group.
In case of using a large number of retro targets as reference points RF, retro targets of color-coded targets CT and retro targets as separate units are mixed together. In such a case, colored retro targets may be used in the color-coded targets and white retro targets may be used as the separate units, allowing easy discrimination.
Here, an example of the case (1) is described. In
In addition, the pattern of the detected color-coded target CT may be compared with the color-coded target correlation table 142 to verify which type of color-coded target it is.
Then, the area/direction detection processing section 131 of the color-coded target detection processing section 130 finds the area and the direction of the color-coded target CT from a group of retro targets based on the centers of gravity of the retro targets stored in the read image storage section 141 (S530). Before or after the area and the direction are determined, the color detection processing section 311 detects the colors of the reference color part P2, the color code part P3, and the measuring object 1 in the image. If necessary, the color correction section 312 may correct the colors of the color code part P3 and the measuring object 1 in the image with reference to the color of the reference color part P2. In case where a color-coded target printed in a color which cannot be used as a reference is used, its reference color part is also corrected. Then, the verification processing section 313 verifies whether or not the grouping has been performed properly, that is, whether or not the centers of gravity of the retro targets once grouped into the same group do belong to the same color-coded target CT. If they are discriminated as belonging to the same group, the process proceeds to the next process, identification code determination (S535); if not, the process returns to the grouping process (S520).
For labeling, a triangle is created using as its vertexes the centers of gravity R1 to R3 of the subject three retro targets (S600). One of the centers of gravity R1 to R3 of the three retro targets is selected arbitrarily and labeled tentatively as T1 (S610), and the remaining two centers of gravity are labeled tentatively as T2 and T3 clockwise (S612; see
Then, the interior of the triangle is scanned in the manner of an arc to obtain the values of pixels distanced by a radius R from each vertex (center of gravity) in order to see changes in color over the scanned range (see
Scanning is performed clockwise from L12 to L31 on the center of gravity T1, clockwise from L23 to L12 on the center of gravity T2, and clockwise from L31 to L23 on the center of gravity T3 (S620 to S625).
The radius is determined by multiplying the size of the retro target on the image by a multiplication factor depending on the scanning angle. In case where the retro target is photographed from an oblique direction and hence looks oval, the scanning range is also determined as oval. The multiplication factor is determined according to the size of the retro target and the distance between the center of gravity of the retro target and the reference color part P2.
The process of verifying the labeling is performed by the verification processing section 313. The center of gravity with changes in color as a result of scanning is labeled as R1, and the remaining two centers of gravity are labeled clockwise from the center of gravity with changes in color as R2 and R3 (S630 to S632). In this example, the center of gravity T2 is labeled as R1, the center of gravity T3 as R2, and the center of gravity T1 as R3. Unless exactly one center of gravity with changes in color and two centers of gravity with no changes in color are detected, it is determined as a grouping error of retro targets (S633), three retro targets are selected again (S634), and the process returns to S600. As described above, it is possible to verify whether or not the three selected retro targets belong to the same color-coded target CT1 based on the process results. In this way, the grouping of retro targets is established.
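As a minimal sketch of the color-change test used in this labeling step (assuming an RGB image as a NumPy array; the circular scanning range and the variance criterion are simplified assumptions):

```python
import numpy as np

def color_change_on_arc(img, center, radius, ang_start, ang_end, steps=60):
    """Sample pixel colors along an arc around a retro target center and
    return the summed channel variance; a large value means the arc runs
    through differently colored unit areas (changes in color)."""
    cx, cy = center
    samples = []
    for a in np.linspace(ang_start, ang_end, steps):
        x = int(round(cx + radius * np.cos(a)))
        y = int(round(cy + radius * np.sin(a)))
        if 0 <= y < img.shape[0] and 0 <= x < img.shape[1]:
            samples.append(img[y, x])
    if not samples:
        return 0.0
    return float(np.asarray(samples, dtype=float).var(axis=0).sum())
```

The vertex whose arc shows changes in color would then be labeled R1, and the other two labeled R2 and R3 clockwise, as described above.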
The above labeling method is described taking the color-coded target CT1 of
Turning to
This process flow is described with reference to
Then, it is checked whether or not a white part P4 is located on the coordinate-transformed color-coded target CT1 as specified by the design values (S650). If not located as specified by the design values, it is determined as a detection error (S633). If a white part P4 is located as specified by the design values, it is determined that a color-coded target CT1 has been detected (S655).
Then, the color code of the color-corrected color-coded target CT1 with known area and direction is discriminated.
The color code part P3 expresses a code using a combination of colors distributed to respective unit areas. For example, in case where the number of code colors is “n” and there are three unit areas, n×n×n codes can be expressed. Under the condition that the unit areas do not have redundant colors, n×(n−1)×(n−2) codes can be expressed. Under the condition that there are “n” unit areas and they do not use redundant colors, n factorial kinds of codes can be expressed.
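For illustration, these counts can be checked directly; a minimal sketch with an assumed n = 6 code colors and three unit areas:

```python
from math import factorial, perm

n, units = 6, 3  # assumed example: six code colors, three unit areas
print(n ** units)      # colors may repeat:        n*n*n         = 216 codes
print(perm(n, units))  # no redundant colors:      n*(n-1)*(n-2) = 120 codes
print(factorial(n))    # n areas, no redundancy:   n!            = 720 codes
```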
The code conversion processing section 322 of the identification code discrimination section 46 compares the combination of colors of the unit areas in the color code part P3 with the combination of colors in the color-coded target correlation table 142 to discriminate an identification code.
There are two ways to discriminate colors: (1) a relative comparison method by comparison between the colors of the reference color part P2 and the colors of the color code part P3, and (2) an absolute comparison method by correcting the colors of the color-coded target CT1 using the colors of the reference color part P2 and the color of the white part P4, and discriminating the code of the color code part P3 based on the corrected colors. For example, in case where a small number of colors are used in the color code part P3, the reference colors are used as colors to be compared with for relative comparison, and in case where a large number of colors are used in the color code part P3, the reference colors are used as colors for calibration purposes to correct the colors, or as colors to be compared with for absolute comparison. As described before, the color detection processing section 311 performs color detection, and the color correction section 312 performs color correction.
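A minimal sketch of the relative comparison method (1), assuming the detected colors are given as RGB rows and each unit area is simply assigned the nearest reference color; the function name is hypothetical:

```python
import numpy as np

def discriminate_code_colors(code_colors, reference_colors):
    """Relative comparison: assign each unit area of the color code part
    the index of the nearest reference color (rows are RGB triples)."""
    code = np.asarray(code_colors, dtype=float)
    ref = np.asarray(reference_colors, dtype=float)
    d = np.linalg.norm(code[:, None, :] - ref[None, :, :], axis=2)
    return d.argmin(axis=1)  # one reference-color index per unit area
```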
The code conversion processing section 322 of the identification code discrimination section 46 detects the reference color part P2 and the color code part P3 using either color discrimination method (1) or (2) (S660, S670), discriminates the colors of the color code part P3 (S535 of
First, a pattern storage section 495 (see
By utilizing the projection device to perform preparation before measurement and reconstruct a target pattern for use in orientation or three-dimensional measurement in this way, non-contact three-dimensional measurement can be performed appropriately and automatically on various objects.
First, the pattern forming section 492 forms a measurement pattern including color-coded targets CT having a position detection pattern P1 for indicating a measurement position, and a color code pattern P3 colored with plural colors to allow identification of the targets (pattern forming process; S810). Then, the pattern projection control section 493 causes the projection section 12 to project the measurement pattern formed in the pattern forming process (projection process; S840). Then, the photographing section 10 photographs the measurement pattern projected in the projection process (photographing process; S850). Then, the pattern detection section 491 detects the position detection pattern P1 and the color code pattern P3 from an image of the measurement pattern photographed in the photographing process to identify a color code (pattern detection process; S860).
The use of color-coded targets in this way allows easy identification of the respective targets and automatic connection of images of the measuring object over a wide area, thereby improving the efficiency of and promoting the automation of orientation and three-dimensional measurement.
An example in which the measurement pattern is formed by the pattern forming section has been described in the first embodiment. Now, the following describes an example in which plural measurement patterns are stored in the pattern storage section, and the measurement pattern most appropriate for the condition is selected by the pattern selection section and projected. Also, the pattern storage section can store the measurement pattern formed by the pattern forming section.
The pattern storage section 495 stores measurement patterns including a color-coded target CT and a monochrome target pattern. These may be of various arrangements and colors. The pattern selection section 496 suitably selects a measurement pattern to be projected, out of the various measurement patterns stored in the pattern storage section 495. The pattern projection control section 493 causes the projection section 12 to project the measurement pattern selected by the pattern selection section 496. The pattern storage section 495 may store pattern elements such as a color-coded target and a monochrome target pattern, and the pattern forming section 492 may edit or form a pattern using these elements. The measurement pattern formed by the pattern forming section 492 may be stored in the pattern storage section 495, so that the pattern projection control section 493 can cause the projection section 12 to project the measurement pattern formed by the pattern forming section 492.
First, the pattern storage section 495 stores plural measurement patterns indicating measurement points on the surface of the measuring object (pattern storage process; S710). Then, the pattern projection control section 493 causes the projection section 12 to project one of the plural measurement patterns as a first measurement pattern (first projection process; S720). Then, the photographing section 10 photographs the first measurement pattern projected in the first projection process (photographing process; S730). Then, the pattern detection section 491 detects the measurement points from an image of the first measurement pattern photographed in the photographing process (pattern detection process; S740). Then, the pattern detection section 491 detects displacement of the measurement points in the first measurement pattern detected in the pattern detection process (displacement detection process; S750). Then, the pattern selection section 496 selects, based on the detected displacement of the measurement points in the first measurement pattern, a third measurement pattern where the measurement points are increased, deleted or changed (pattern selection process; S780). Then, the pattern projection control section 493 causes the projection section 12 to project the third measurement pattern (third projection process; S790).
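A minimal sketch of this S710–S790 selection flow, assuming the stored patterns are ordered from coarse to dense, each pattern carries the point coordinates expected for a flat surface (`expected_points`, a hypothetical attribute), and `detect` returns an (N, 2) array aligned with them:

```python
import numpy as np

def select_and_project(stored_patterns, project, photograph, detect,
                       displacement_threshold=2.0):
    """Sketch of the S710-S790 flow: project one stored pattern as the first
    measurement pattern, measure the displacement of its points, and select
    a pattern with more (or changed) points where displacement is large."""
    first = stored_patterns[0]
    project(first)                                 # S720: first projection
    image = photograph()                           # S730: photograph it
    detected = detect(image)                       # S740: (N, 2) point coords
    # S750: displacement from the points a plane perpendicular to the
    # projection light would produce (`expected_points` is hypothetical).
    displacement = np.linalg.norm(detected - first.expected_points, axis=1)
    # S780: select a denser stored pattern if any displacement is large.
    if displacement.max() > displacement_threshold:
        third = stored_patterns[1]
    else:
        third = first
    project(third)                                 # S790: third projection
    return third
```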
First, the pattern storage section 495 stores plural measurement patterns including color-coded targets CT having a position detection pattern P1 for indicating a measurement position, and a color code pattern P3 colored with plural colors to allow identification of the targets (pattern storage process; S820). Then, the pattern selection section 496 selects a measurement pattern to be projected, out of the plural measurement patterns stored in the pattern storage process (pattern selection process; S830). Then, the pattern projection control section 493 causes the projection section 12 to project the measurement pattern selected in the pattern selection process (projection process; S840). Then, the photographing section 10 photographs the measurement pattern projected in the projection process (photographing process; S850). Then, the pattern detection section 491 detects the position detection pattern P1 and the color code pattern P3 from an image of the measurement pattern photographed in the photographing process to identify a color code (pattern detection process; S860).
In this embodiment, an example is described in which color-coded targets are not used, and only ordinary reference points (retro targets and templates) are used. These reference points include only position detection patterns, such as the black-and-white retro targets shown in
In the system structure of this embodiment, the extraction section 41 can be simplified so as to include only the search processing section 110, and the identification code discrimination section 46 can be omitted, compared to those of the first and second embodiments shown in
In this embodiment, a method is described in which measurement or approximate measurement is performed at the stage of preparation before measurement (S255: see
If the number of measurement points (including orientation points) in the measurement preparation pattern is large enough, the projected orientation points may be used as measurement points to complete the measurement process. In this case, the processes after S260 of
In case where higher accuracy is required, approximate surface measurement may be performed.
Here, if tie points (color-coded targets for connection purposes) are affixed beforehand, the process flow of preparation before measurement (S300 to S320) can be repeated to complete the measurement of this area, rather than serving only as processing before measurement.
In this embodiment, the number of times of pattern projection is increased. The system structure of this embodiment is the same as that in the first or third embodiment. For example, orientation points may be projected plural times, or measurement points may be projected plural times. In case where mismatching occurs in measurement, orientation points may be increased and projected again for further measurement.
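One way such repeated projection might be driven is sketched below; the `match` and `densify` callables are hypothetical stand-ins for the stereo matching step and for a routine that adds orientation points where matching failed:

```python
def measure_with_retries(project, photograph, match, densify, pattern,
                         max_rounds=5):
    """Project, photograph and match repeatedly; on a mismatch, increase
    the orientation points in the pattern and project again."""
    for _ in range(max_rounds):
        project(pattern)
        image = photograph()
        ok, mismatched_regions = match(image)   # stereo matching result
        if ok:
            return image                        # measurement completed
        # Add orientation points where matching failed, then try again.
        pattern = densify(pattern, mismatched_regions)
    raise RuntimeError("matching still failing after repeated projection")
```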
This invention may be implemented as a computer-readable program which causes a computer to execute a method for projecting a three-dimensional measurement pattern or a three-dimensional measurement method described in the embodiments described above. The program may be stored in a built-in memory of the calculation processing section 49, stored in a storage device disposed internally or externally to the system, or downloaded via the Internet. This invention may also be implemented as a storage medium storing the program.
The three-dimensional measurement system or the color-coded target according to this invention described above may also be used as follows.
In the projection device for three-dimensional measurement described above according to the invention, the photographing section 10 may be of a variable focal length, and the projection section 12 may be able to set the projection range of the measurement pattern P according to the photographing range set with the photographing section 10. With this constitution, an appropriate projection range can be set according to the focal length, etc. of the photographing section.
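For instance, under a simple pinhole-camera assumption, the photographing range at a given distance follows from the focal length and sensor width, and the projection range would be set to cover it; a sketch (the numeric values are illustrative only):

```python
import math

def photographing_width(focal_length_mm, sensor_width_mm, distance_m):
    """Width of the photographed area at the object (pinhole model)."""
    half_angle = math.atan(sensor_width_mm / (2.0 * focal_length_mm))
    return 2.0 * distance_m * math.tan(half_angle)

# Example: a 24 mm lens on a 36 mm-wide sensor at 2 m covers about 3 m,
# so the projection range would be set to (at most) that width.
projection_width_m = photographing_width(24.0, 36.0, 2.0)
```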
In the projection device for three-dimensional measurement described above according to the invention, the pattern projection control section 493 may cause the projection section 12 to cast uniform light for obtaining texture onto the measuring object. With this constitution, the three-dimensional shape of the measuring object can be approximately grasped, and utilized to design a second measurement pattern or to select a third measurement pattern.
In the projection device for three-dimensional measurement described above according to the invention, the pattern projection control section 493 may be able to adjust the arrangement of measurement points Q in the measurement pattern P and the pattern density when any one of the focal length, the photographing distance, the baseline length and the overlap ratio of the photographing section 10 is input. With this constitution, an appropriate measurement pattern can be selected according to the focal length, etc. of the photographing section.
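One plausible way to derive such a pattern density, as a sketch under pinhole-model assumptions (the ground sampling distance converts a desired on-image point spacing in pixels into object-space spacing; the formula and values are illustrative, not the actual adjustment rule):

```python
def point_spacing_m(focal_length_mm, sensor_width_mm, image_width_px,
                    distance_m, desired_spacing_px):
    """Object-space spacing of measurement points Q that yields a desired
    spacing in pixels on the photographed image."""
    pixel_size_mm = sensor_width_mm / image_width_px
    gsd_m = pixel_size_mm * distance_m / focal_length_mm  # object m per px
    return desired_spacing_px * gsd_m

# Example: 24 mm lens, 36 mm / 6000 px sensor, 2 m distance, one point
# every 50 px on the image -> points roughly every 25 mm on the object.
spacing = point_spacing_m(24.0, 36.0, 6000, 2.0, 50)
```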
In a three-dimensional measurement system having the projection device for three-dimensional measurement described above according to the invention, the photographed image may be a stereo image pair, and a matching processing section 70 for performing a pattern matching process of the stereo photographed image may be provided. The matching processing section 70 may perform the pattern matching process using the photographed image of a first measurement pattern projected, and the pattern forming section 492 may add measurement points Q to areas in the first measurement pattern corresponding to bad areas on the photographed image detected in the matching process, to form a second measurement pattern or a third measurement pattern.
Here, the bad areas detected in the matching process refer to areas in which, in the matching process of the stereo image, the coordinates of measurement points differ greatly between the two photographed images, while the coordinates of most measurement points agree or differ only minimally. In these areas, accurate measurement has not been achieved, and increasing the measurement points there makes accurate measurement possible. With this constitution, accurate measurement can be achieved with a smaller number of repetitions.
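A sketch of this criterion, assuming matched point coordinates from the two images of the stereo pair are available as arrays; points whose left/right disagreement greatly exceeds the typical (median) disagreement mark the bad areas where points would be added (the threshold rule is an assumption):

```python
import numpy as np

def bad_point_indices(left_points, right_points, tolerance_px=1.5):
    """Return indices of measurement points whose coordinates on the two
    photographed images disagree far more than the bulk of the points do.
    left_points / right_points: (N, 2) arrays of matched coordinates."""
    residual = np.linalg.norm(left_points - right_points, axis=1)
    typical = np.median(residual)       # most points agree or differ little
    return np.where(residual > typical + tolerance_px)[0]

# The pattern forming section would then add measurement points Q in the
# first-pattern areas corresponding to these indices, yielding the second
# (or third) measurement pattern.
```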
The invention may be implemented as a three-dimensional measurement system having the projection device for three-dimensional measurement described above. With this constitution, the measurement pattern can be optimized, thereby improving the efficiency of orientation and three-dimensional measurement using the optimized measurement pattern. Also, the processes from projection of a measurement pattern to detection of it can be automated, thereby promoting the automation of orientation and three-dimensional measurement.
The method for projecting a three-dimensional measurement pattern according to the invention may include, as shown for example in
With this constitution, the measurement pattern can be optimized, thereby improving the efficiency of orientation and three-dimensional measurement using the optimized measurement pattern. Also, the processes from projection of a measurement pattern to detection of it can be automated, thereby promoting the automation of orientation and three-dimensional measurement.
The method for projecting a three-dimensional measurement pattern according to the invention may include, as shown for example in
With this constitution, the measurement pattern can be optimized, thereby improving the efficiency of orientation and three-dimensional measurement using the optimized measurement pattern. Also, the processes from projection of a measurement pattern to detection of it can be automated, thereby promoting the automation of orientation and three-dimensional measurement.
In the method for projecting a three-dimensional measurement pattern according to the invention, the image photographed in the photographing process S730 may be a stereo image pair. The method may include, as shown for example in
Here, the reference points added to the second measurement pattern or the third measurement pattern may be projected and used for photographing as they are, or used for photographing as affixed at the points projected on the measuring object. With this constitution, reference points can be sequentially increased in the orientation process or the three-dimensional measurement process to proceed to accurate orientation or accurate measurement.
In the projection device for three-dimensional measurement described above according to the invention, the pattern forming section 492 may form a monochrome target pattern including only position detection patterns. With this constitution, the color code pattern may be used for the measurement of reference points and the monochrome target pattern may be used for accurate measurement, for example, thereby improving the efficiency of measurement.
In the projection device for three-dimensional measurement described above according to the invention, the pattern storage section 495 may store a monochrome target pattern including only position detection patterns. With this constitution, the color code pattern may be used for the measurement of reference points and the monochrome target pattern may be used for accurate measurement, for example, thereby improving the efficiency of measurement.
The projection device for three-dimensional measurement described above according to the invention may include a pattern projection control section 493 for controlling the projection section 12 to project a measurement pattern. The pattern projection control section 493 may cause the projection section 12 to project a random pattern in which position detection patterns are arranged at random. It may be possible to switch between a measurement mode in which the measurement pattern is projected, and a random pattern mode in which the random pattern is projected. With this constitution, it is possible to easily switch between the orientation and the three-dimensional measurement, for example.
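A random pattern of this kind might be generated as follows; this is a sketch using rejection sampling to keep the position detection patterns from overlapping (the minimum-separation rule is an assumption, not taken from the text):

```python
import random

def random_pattern(width, height, n_points, min_separation, max_draws=100000):
    """Place position detection patterns at random, rejecting draws that
    fall closer than min_separation to an already placed point."""
    points = []
    for _ in range(max_draws):
        if len(points) == n_points:
            break
        p = (random.uniform(0.0, width), random.uniform(0.0, height))
        if all((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2
               >= min_separation ** 2 for q in points):
            points.append(p)
    return points
```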
The projection device for three-dimensional measurement described above according to the invention may include a pattern projection control section 493 for controlling the projection section 12 to project a measurement pattern. The pattern projection control section 493 may cause the projection section 12 to project an overlap photographing range indication pattern indicating the overlapping range of a stereo image. It may be possible to switch between a measurement mode in which the measurement pattern is projected, and a photographing range indication mode in which the overlap photographing range indication pattern is projected. With this constitution, it is possible to easily switch between the orientation and the setting of a photographing range, for example.
The projection device for three-dimensional measurement described above according to the invention may include a pattern projection control section 493 for controlling the projection section 12 to project a measurement pattern. The pattern projection control section 493 may be able to adjust the arrangement of measurement points and the pattern density in the measurement pattern when any one of the focal length, the photographing distance, the baseline length and the overlap ratio of the photographing section 10 is input. Here, the measurement points include orientation points. With this constitution, an appropriate measurement pattern can be selected according to the focal length, etc. of the photographing section.
The projection device for three-dimensional measurement described above according to the invention may include a pattern projection control section 493 for controlling the projection section 12 to project a measurement pattern. The pattern projection control section 493 may cause the projection section 12 to cast uniform light for obtaining texture onto the measuring object. It may be possible to switch between a measurement mode in which the measurement pattern is projected, and a texture lighting mode in which the light for obtaining texture is cast. With this constitution, the three-dimensional shape of the measuring object can be approximately grasped through the texture lighting mode.
In the projection device for three-dimensional measurement described above according to the invention, the pattern detection section 491 may include a color modification section 494 for modifying the color in the color-coded target CT to be projected by the projection section 12, based on the color obtained from the photographed image of the pattern projected in the texture lighting mode. With this constitution, the color in the color-coded target can be modified according to the brightness or darkness in the photographed image, thereby facilitating identification of a color code.
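A sketch of one such modification, assuming the mean scene color measured from the texture-lighting image is available; each projected code color is scaled channel-wise so that dark or strongly colored surfaces do not wash out the code (the per-channel gain rule is an assumption, not the actual method of the color modification section 494):

```python
import numpy as np

def modify_code_colors(code_colors, scene_rgb_mean, target_gray=128.0):
    """Scale each projected code color channel-wise by the inverse of the
    scene's mean color, so codes stay distinguishable on dark or strongly
    colored surfaces. code_colors: {name: (R, G, B)}."""
    scene = np.maximum(np.asarray(scene_rgb_mean, dtype=float), 1.0)
    gain = target_gray / scene          # boost channels the scene absorbs
    return {name: tuple(np.clip(np.asarray(rgb, dtype=float) * gain, 0, 255))
            for name, rgb in code_colors.items()}
```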
The three-dimensional measurement system 100 according to the invention may include the projection device for three-dimensional measurement described above. With this constitution, projection of a color-coded target can facilitate, and also automate, searching a stereo image for corresponding points, connecting adjacent images, and setting a stereo matching area. This can also improve the efficiency of, and promote the automation of, orientation and three-dimensional measurement.
The method for projecting a three-dimensional measurement pattern according to the invention may include, as shown for example in
With this constitution, identification of the respective color-coded targets can facilitate, and also automate, searching a stereo image for corresponding points, connecting adjacent images, and setting a stereo matching area. This can also improve the efficiency of, and promote the automation of, orientation and three-dimensional measurement.
The method for projecting a three-dimensional measurement pattern according to the invention may include, as shown for example in
With this constitution, identification of the respective color-coded targets can facilitate, and also automate, searching a stereo image for corresponding points, connecting adjacent images, and setting a stereo matching area. This can also improve the efficiency of, and promote the automation of, orientation and three-dimensional measurement.
In the method for projecting a three-dimensional measurement pattern described above according to the invention, the pattern forming process S810 may form a monochrome target pattern including only a position detection pattern, and the pattern detection process S860 may detect the monochrome target pattern. With this constitution, the color code pattern may be used for the measurement of reference points and the monochrome target pattern may be used for accurate measurement, for example, thereby improving the efficiency of measurement.
In the method for projecting a three-dimensional measurement pattern described above according to the invention, the pattern storage process S820 may store a monochrome target pattern including only a position detection pattern, and the pattern detection process S860 may detect the monochrome target pattern. With this constitution, the color code pattern may be used for the measurement of reference points and the monochrome target pattern may be used for accurate measurement, for example, thereby improving the efficiency of measurement.
In the projection method for three-dimensional measurement described above according to the invention, the image photographed in the photographing process S850 may be a stereo image pair. The method may include an orientation process S30 for determining orientation of the stereo image, and a three-dimensional measurement process S50 for measuring the three-dimensional shape of the measuring object. In the orientation process S30 or the three-dimensional measurement process S50, the color-coded targets CT may be projected as measurement points indicating the reference positions for measurement, and the monochrome target patterns may be projected as reference points. At the measurement points indicating the reference positions for measurement, target patterns may be projected and used for photographing as they are, or target patterns may be affixed and used for photographing. This constitution can improve the efficiency of measurement.
Embodiments of this invention have been described above. It should be understood that the invention is not limited to the embodiments described above, and that various modifications can be made to the embodiments without departing from the scope of the invention. For example, in the above embodiments, the measurement points are increased when forming a second measurement pattern. However, the measurement points may instead be reduced or changed. The constitution of the color-coded target may be different from those of
A series of images may be photographed such that each photographed image includes four color-coded targets CT and adjacent photographed images share two color-coded targets. The arrangement of the series of photographed images may be determined automatically such that the identification codes of the color-coded targets CT shared by adjacent photographed images coincide with each other. The stereo camera, the projector and the calculation processing section may be constituted integrally with or separately from each other. The pattern detection section of the calculation processing section may be constituted separately from the extraction section, the reference point setting section, the corresponding point search section, etc. within the correlating section, rather than commonly with them as in the above embodiments.
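Such automatic arrangement might be sketched as follows, with each image represented by its set of detected identification codes (a simplification: a full implementation would start from an end image that shares codes with only one neighbour, rather than an arbitrary one):

```python
def order_images(images):
    """images: list of (image_id, set_of_identification_codes). Chain
    images so consecutive ones share exactly two color-coded target codes."""
    remaining = dict(images)
    image_id, codes = remaining.popitem()    # arbitrary starting image
    sequence = [image_id]
    while remaining:
        nxt = next((i for i, c in remaining.items() if len(codes & c) == 2),
                   None)
        if nxt is None:
            break                            # no adjacent image remains
        codes = remaining.pop(nxt)
        sequence.append(nxt)
    return sequence

# Example: three overlapping photographs, two shared codes per neighbour.
print(order_images([("img2", {3, 4, 5, 6}),
                    ("img1", {1, 2, 3, 4}),
                    ("img3", {5, 6, 7, 8})]))
```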
The use of the terms “a” and “an” and “the” and similar referents in the context of describing the invention (especially in the context of the following claims) is to be construed to cover both the singular and the plural, unless otherwise indicated herein or clearly contradicted by context. The terms “comprising”, “having”, “including” and “containing” are to be construed as open-ended terms (i.e., meaning “including, but not limited to,”) unless otherwise noted or clearly contradicted by context. Recitation of ranges of values herein is merely intended to serve as a shorthand method of referring individually to each separate value falling within the range, unless otherwise indicated herein, and each separate value is incorporated into the specification as if it were individually recited herein. All methods described herein can be performed in any suitable order unless otherwise indicated herein or otherwise clearly contradicted by context. The use of any and all examples, or exemplary language (e.g., “such as”) provided herein, is intended merely to better illuminate the invention and does not pose a limitation on the scope of the invention unless otherwise claimed. No language in the specification should be construed as indicating any non-claimed element as essential to the practice of the invention.
Preferred embodiments of this invention are described herein, including the best mode known to the inventors for carrying out the invention. Variations of those preferred embodiments may become apparent to those of ordinary skill in the art upon reading the foregoing description. The inventors expect skilled artisans to employ such variations as appropriate, and the inventors intend for the invention to be practiced otherwise than as specifically described herein. Accordingly, this invention includes all modifications and equivalents of the subject matter recited in the claims appended hereto as permitted by applicable law. Moreover, any combination of the above-described elements in all possible variations thereof is encompassed by the invention unless otherwise indicated herein or otherwise clearly contradicted by context.
This invention is applicable to a system and method for three-dimensionally measuring an object in a non-contact manner.
The main reference numerals and symbols are described as follows:
Number | Date | Country | Kind
---|---|---|---
2005-289332 | Sep 2005 | JP | national
2005-289333 | Sep 2005 | JP | national