The present invention relates to an information processing apparatus, a control method therefor, and a computer-readable storage medium.
In the field of industrial machine vision, three dimensional measurement techniques are known as technology components. A method of performing three dimensional measurement using machine vision will briefly be described hereinafter. First, a target object to be measured is irradiated with light which bears the information of a two dimensional pattern, and an image of the target object to be measured onto which the two dimensional pattern is projected is captured by a camera. Then, based on the periodicity of the two dimensional pattern, the captured image is analyzed to obtain distance information to the target object to be measured. This distance information indicates the distance from the camera to the target object to be measured, or the depth of, for example, the three dimensional surface structure. Because information in the width and height directions can be obtained from the two dimensional captured image, three dimensional space information is obtained at this time. Lastly, three dimensional model fitting is performed using the two dimensional captured image, the distance information, and the model information of the target object to be measured held in advance, thereby measuring the position, orientation, and three dimensional shape of the target object to be measured.
This technique is often used to, for example, pick up or assemble components by a robot arm in a factory manufacturing line. The positions, orientations, and three dimensional shapes of components are measured using the three dimensional measurement technique, and the robot arm is controlled based on the obtained information, thereby allowing the robot arm to efficiently, accurately pick up and assemble the components.
As a three dimensional measurement method which uses a two dimensional pattern, the spatial coding method or the phase shift method, for example, is available. These methods are effective because they can simultaneously be used for an image recognition process. Also, a pattern projection operation which uses a projector can variably project different patterns, and is therefore effective in a three dimensional measurement method which requires a plurality of patterns, such as the spatial coding method or the phase shift method. Note that the projector can project different patterns upon switching between them at a frame rate of 30 fps to 60 fps or more. Because the camera can capture an image at a high frame rate as well, and the projector and camera resolutions have improved, three dimensional measurement can be performed with high speed and high accuracy as long as different patterns can variably be projected for each frame.
Japanese Patent Laid-Open No. 2009-186404 discloses a technique of synchronizing an operation of turning on an illumination light source to illuminate an object in obtaining two dimensional image information, and an operation of turning on a projection light source to project a geometric pattern onto the object in obtaining three dimensional image information.
Japanese Patent No. 2997245 discloses a technique of sequentially switching between a plurality of pattern masks, and making an electronic flash light source emit light for every switching operation, thereby capturing an image.
Japanese Patent Laid-Open No. 7-234929 discloses a technique in which assuming that a CCD (Charge-Coupled Device) is used as an image sensor, the image input period (full-pixel simultaneous input) and the image output period are clearly separated, and the projection pattern is switched during the image output period.
Unfortunately, the above-mentioned related art techniques pose the following problem. A high-pressure mercury lamp is the current mainstream projector light source. Unlike a halogen lamp, the mercury lamp uses no filament, and therefore has a relatively long life, but requires component replacement every few months when it is always kept ON for industrial purposes. Also, it takes much time for the mercury lamp to become stable after turn-on, so the mercury lamp must be kept ON during the required process time once it is turned on. This is disadvantageous in terms of not only wastefully keeping the mercury lamp ON during a time other than the measurement time, but also making it necessary to suppress an increase in temperature as the mercury lamp must be kept ON for a long time.
Furthermore, none of Japanese Patent Laid-Open No. 2009-186404, Japanese Patent No. 2997245, and Japanese Patent Laid-Open No. 7-234929 accurately controls light emission within one frame in accordance with the projection and image capture characteristics.
Conventionally, a light source is kept ON because a high-pressure mercury lamp is used as the light source, but the use of a variable ON/OFF light source (for example, an LED light source) allows operations as in the related art techniques. Further, a measurement apparatus which uses an optimum light source effective in terms of both heat removal and energy saving can be achieved by finer LED ON/OFF control, in which the LED is turned on only within the period required for measurement.
The present invention has been made in consideration of the above-mentioned problem, and provides a technique of shortening the ON time of a light source to prevent an increase in temperature of the light source, thereby allowing life prolongation and power saving of the light source.
According to one aspect of the present invention, there is provided an information processing apparatus comprising: projection means for projecting a projection pattern generated by a display device onto a target object by turning on a variable ON/OFF light source; image capture means for capturing an image of the target object onto which the projection pattern is projected; calculation means for calculating an image capture period of the image capture means based on a response property of the display device and image capture characteristics of the image capture means; and control means for controlling to synchronize the image capture period of the image capture means and a projection period of the projection means by keeping the light source ON during the image capture period.
According to one aspect of the present invention, there is provided a control method for an information processing apparatus including projection means, image capture means, calculation means, and control means, the method comprising: a projection step of causing the projection means to project a projection pattern generated by a display device onto a target object by turning on a variable ON/OFF light source; an image capture step of causing the image capture means to capture an image of the target object onto which the projection pattern is projected; a calculation step of causing the calculation means to calculate an image capture period of the image capture means based on a response property of the display device and image capture characteristics of the image capture means; and a control step of causing the control means to control to synchronize the image capture period of the image capture means and a projection period of the projection means by keeping the light source ON during the image capture period.
Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
An exemplary embodiment(s) of the present invention will now be described in detail with reference to the drawings. It should be noted that the relative arrangement of the components, the numerical expressions and numerical values set forth in these embodiments do not limit the scope of the present invention unless it is specifically stated otherwise.
The configuration of a three dimensional measurement apparatus which functions as an information processing apparatus in the first embodiment will be described with reference to
The overall control unit 101 includes a measurement pattern output unit 101-1, projection/image capture synchronization information management unit 101-2, measurement image processing unit 101-3, and ON information management unit 101-4. The projection unit 102 includes a light source unit 102-1. The synchronization control unit 104 includes a light source control unit 104-1.
The overall control unit 101 controls each processing unit including the measurement pattern output unit 101-1, projection/image capture synchronization information management unit 101-2, measurement image processing unit 101-3, and ON information management unit 101-4, and controls information exchange between the overall control unit 101 and the projection unit 102, image capture unit 103, or synchronization control unit 104.
The measurement pattern output unit 101-1 generates a projection pattern image to be used for measurement. The measurement pattern output unit 101-1 transmits the projection pattern image to the projection unit 102.
The projection/image capture synchronization information management unit 101-2 stores, in advance, and manages synchronization information for synchronizing the projection start timing of the projection unit 102 and the image capture start timing of the image capture unit 103. The projection/image capture synchronization information management unit 101-2 reads out the synchronization information, and transmits it to the light source control unit 104-1.
The measurement image processing unit 101-3 receives and analyzes image data captured by the image capture unit 103 to extract pattern edge position information. The measurement image processing unit 101-3 then creates a distance information map to a target object to be measured using the principle of triangulation, based on the baseline length between the projection unit 102 and the image capture unit 103, and the distance to the target object to be measured.
The ON information management unit 101-4 generates ON information for controlling the ON timing of the variable ON/OFF light source unit 102-1 of the projection unit 102. The ON information management unit 101-4 transmits the ON information to the synchronization control unit 104.
The projection unit 102 projects a projection pattern onto the target object to be measured. The image capture unit 103 captures an image of the target object to be measured onto which the projection pattern is projected by the projection unit 102.
The synchronization control unit 104 adjusts the start timing of projection by the projection unit 102, and the start timing of image capture by the image capture unit 103, based on the received ON information and synchronization information, thereby controlling their synchronization, in order to control the projection unit 102 and the image capture unit 103 at high speed for each frame in three dimensional measurement by pattern projection.
More specifically, the overall control unit 101 controls the measurement pattern output unit 101-1 to transmit a projection pattern image which is generated by the measurement pattern output unit 101-1 and to be used for measurement to the projection unit 102. The overall control unit 101 also controls the ON information management unit 101-4 to transmit ON information generated by the ON information management unit 101-4 to the synchronization control unit 104.
The projection unit 102 receives the projection pattern image from the measurement pattern output unit 101-1, and drives a display unit (not shown). The projection unit 102 also receives the ON information from the synchronization control unit 104. The projection unit 102 then projects the projection pattern onto the target object to be measured after turn-on of the light source unit 102-1.
The image capture unit 103 captures an image of the target object that is to be measured onto which the projection pattern is projected, and transmits the captured image to the overall control unit 101. The overall control unit 101 receives image data captured by the image capture unit 103. The measurement image processing unit 101-3 analyzes the received image data to extract pattern edge position information. The measurement image processing unit 101-3 then creates a distance information map to the target object to be measured using the principle of triangulation, based on the baseline length between the projection unit 102 and the image capture unit 103, and the distance to the target object to be measured.
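By way of illustration only, the triangulation step performed by the measurement image processing unit 101-3 can be sketched as follows. This is a minimal sketch assuming a simple pinhole projector-camera model; the function and parameter names (depth_from_disparity, baseline_mm, focal_px, disparity_px) are hypothetical and do not appear in the embodiment.

```python
# Minimal sketch only (hypothetical names, not the patented implementation):
# depth by triangulation for a projector-camera pair with baseline B,
# focal length f (in pixels), and a measured pattern-edge disparity d,
# using the standard relation Z = B * f / d.
def depth_from_disparity(baseline_mm, focal_px, disparity_px):
    """Return the distance Z (same unit as baseline_mm) for one pattern edge."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return baseline_mm * focal_px / disparity_px
```

Applying such a calculation to every detected pattern edge would yield a distance information map of the kind described above.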
Although the horizontal scanning direction of projection by the light source unit 102-1 and that of image capture by the image capture unit 103 are set to be the same in
However, if an image is captured using the line-sequential display scheme or the rolling shutter scheme, the sub-scanning direction of the projection unit 102 (that is, a direction perpendicular to the horizontal scanning direction and in which line scanning sequentially progresses when one frame is formed by a plurality of lines) and the sub-scanning direction of the image capture unit 103 are set to be the same. This makes it possible to reduce the difference in scanning position between scanning by the projection unit and scanning by the image capture unit due to the elapse of time.
The image capture unit 103 also has a function for capturing an image upon setting the image capture area to an arbitrary partial area that falls within the range of the projection area of the projection unit 102 by ROI (Region Of Interest) control, as shown in
As the projector, a liquid crystal display device type projector, for example, is widely used. A liquid crystal projector generally adopts the active matrix driving scheme. In the active matrix driving scheme, scanning voltages are sequentially applied to scanning lines (signal lines) for each horizontal scanning period to, in turn, sequentially apply predetermined voltages to corresponding pixel electrodes so as to drive the liquid crystal projector, thereby constructing a display image.
Note that driving schemes are roughly classified into the frame-sequential driving scheme and the line-sequential driving scheme, depending on the method of applying a voltage to a signal line. The frame-sequential driving scheme applies a voltage to a signal line directly in correspondence with the input video signal, while the line-sequential driving scheme temporarily latches a video signal of one line and then applies a voltage to every signal line of that line at once. The line-sequential driving scheme is commonly used. However, in the line-sequential driving scheme, a pattern cannot simultaneously be projected to the entire field, so a display image having lines set in different driven states is obtained at a certain time point. Further, a mixture of the previous frame image and the current frame image is displayed within one field, so it is very difficult to use the line-sequential driving scheme in projecting a plurality of patterns.
As can be seen from the foregoing description, the temporal change in projection luminance of each line is nearly the same, but a change in amount of light occurs depending on the projection position at the same time instant, because the projection start time varies in each individual line. This temporal change in projection luminance occurs due to factors associated with the response property of the liquid crystal device.
In the active matrix driving scheme, a voltage applied to a gate electrode line turns on all FETs (Field Effect Transistors) of one column connected to it, so a current is supplied between their sources and drains, the voltage applied to each source electrode line at that time is applied to a liquid crystal electrode, and a charge corresponding to the voltage is stored in a corresponding capacitor. After the charging operation of one column via the gate electrode line ends, the voltage application sequence shifts to the next column, and the FETs of the first column are turned off upon losing their gate voltages. However, although the liquid crystal electrodes of the first column lose their voltages from the source electrode lines, they can keep almost the required voltages during the period corresponding to one frame until the next gate electrode line is selected. Note that the response time of a liquid crystal panel is longer than that of a cathode-ray tube or PDP (Plasma Display Panel), which is on the order of 1 microsecond. This is because in the liquid crystal panel, a physical change in orientation of a liquid-phase liquid crystal substance is used in display and, more specifically, a lag of a change in orientation is determined upon defining the liquid crystal viscosity and layer thickness as the main parameters. The period from the start of projection until a predetermined amount of light is reached can be calculated based on the response property of the liquid crystal device. Also, the image capture period can be predicted from the calculation result. Moreover, the period in which light in an amount sufficient for measurement can be projected can be calculated, so a synchronization timing can be generated so as to capture an image within the period in which measurement is possible. The lower part of
A CCD sensor implemented by CCDs includes two dimensionally arrayed photodiodes, vertical CCDs, horizontal CCDs, and output amplifiers. A charge photoelectrically converted by the photodiode is sequentially transferred via the vertical CCD and horizontal CCD, converted into a voltage by the output amplifier, and output as a voltage signal. Note that since the charges stored in the vertical CCDs and horizontal CCDs function as one frame memory, the exposure time and the readout time can be separated to allow full-pixel simultaneous exposure. On the other hand, a CMOS sensor includes a photodiode, a horizontal/vertical MOS switch matrix, horizontal and vertical scanning circuits which sequentially scan horizontal and vertical lines, respectively, and an output amplifier. The vertical MOS switch is turned on when a shift pulse from the vertical scanning circuit reaches it, and the horizontal MOS switch is turned on when a shift pulse from the horizontal scanning circuit reaches it, so that the charge photoelectrically converted by the photodiode can be read out.
At this time, because of the horizontal/vertical matrix structure, when both the horizontal and vertical switches are turned on, the charge of the photodiode at the position corresponding to these switches is directly connected to the output amplifier, converted into a voltage, and output as a voltage signal. Note that a charge can be stored only in the photodiode, so the rolling shutter scheme in which sensor information is sequentially output for each horizontal line is adopted. As a special case, a CMOS sensor which adopts the global shutter scheme upon being additionally equipped with a memory function is available, but is expensive due to its complex internal configuration and large circuit scale. Therefore, a rolling shutter CMOS sensor is commonly used.
However, the rolling shutter scheme cannot capture an image by full-field simultaneous exposure, and therefore the respective lines are in different states at the same time instant. This makes it very difficult to use the rolling shutter scheme in image capture for measurement.
The upper part of
As can be seen from
For this reason, the exposure time, transfer time, and horizontal blanking time on one horizontal line are the same in all horizontal lines, but the image capture start time varies in each individual line, so the time elapsed from the start of exposure of each horizontal line varies depending on the image capture position at the same time instant, and one of the exposure state, the transfer state, and the horizontal blanking state may mix with another state. An image capture device has such image capture characteristics. The lower part of
For three dimensional measurement using a projection device and image capture device having such image capture characteristics, it is necessary to ensure a sufficient period in which the projection luminance is constant on the projection side, and to use a plurality of frames so as to ensure a time sufficient to complete image capture on the image capture side during the period in which the projection luminance is constant.
In this embodiment, the synchronization control unit 104 more accurately, appropriately adjusts the synchronization timing between projection and image capture to attain high-speed measurement for each frame.
The I/O unit 301 receives light source ON information and synchronization timing information for synchronization between the projection timing and the image capture timing from the overall control unit 101.
The control unit 302 stores the synchronization timing information received by the I/O unit 301 in the synchronization timing lookup table 303, and stores the light source ON information received by the I/O unit 301 in the ON timing lookup table 307.
The synchronization detection unit 304 receives a signal associated with synchronization from the projection unit 102, and detects a synchronization signal (more specifically, a signal Vsync) required for synchronization.
The synchronization timing generation unit 305 outputs a synchronization timing signal serving as a synchronization generate command to the synchronization generation unit 306 when the timing to generate a synchronization signal comes, based on the synchronization timing information read out from the synchronization timing lookup table 303 for the synchronization signal detected by the synchronization detection unit 304. In response to the synchronization timing signal, the synchronization generation unit 306 generates a synchronization signal that can be recognized as an external trigger signal by the image capture unit 103, and sends it to the image capture unit 103.
The ON period generation unit 308 generates an ON signal from the ON timing information, read out from the ON timing lookup table 307, using the synchronization timing signal generated by the synchronization timing generation unit 305, and outputs it to the light source unit 102-1 of the projection unit 102.
In this manner, both the image capture timing of the image capture unit 103 and the ON timing of the light source unit 102-1 of the projection unit 102 can be controlled in synchronism with the projection timing of the projection unit 102.
Referring to
Note that the left part of
A time s1 is the start time of effective pattern projection, which is delayed due to factors associated with the line-sequential driving scheme and the rise characteristics of the display device of the projection unit 102, and a time s2 is the end time of image capture. The duration obtained by subtracting the time s1 from the time s2 is an ON period 403 of the light source unit.
A method of calculating a timing which allows efficient, high-speed projection and image capture by synchronizing the projection operation and the image capture operation in order to set a minimum ON period 403 will be described below.
First, to accurately measure the edge position of the projection pattern, it is necessary to set the resolution of the image capture unit 103 higher than that of the projection unit 102. When the resolution of the image capture unit 103 is p times that of the projection unit 102,
the number of horizontal pixels n of the image capture unit 103 and the number of horizontal pixels m of the projection unit 102 have a relation given by:
n = m × p, and
the number of vertical lines N of the image capture unit 103 and the number of vertical lines M of the projection unit 102 have a relation given by:
N = M × p (L + N = M × p in ROI control, where L is the start line of ROI control)
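The resolution relations above can be sketched as follows. This is an illustrative helper, not part of the apparatus; it assumes that n and N denote the image capture resolution and m and M the projection resolution, consistent with the later use of n in ΔHs = (n + bk) × f.

```python
# Illustrative sketch of the resolution relations (hypothetical naming).
# m, M: projection horizontal pixels / vertical lines; p: resolution ratio;
# L: ROI control start line (None when ROI control is not used).
def capture_resolution(m, M, p, L=None):
    """Return (n, N): image capture horizontal pixels and vertical lines."""
    n = m * p                 # n = m * p
    if L is None:
        N = M * p             # N = M * p without ROI control
    else:
        N = M * p - L         # L + N = M * p in ROI control
    return n, N
```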
Although the projection unit 102 can adopt the frame-sequential driving scheme or the line-sequential scheme as its display scheme, the following description assumes a line-sequential driving liquid crystal device, timing adjustment of which is difficult.
As performance characteristic information unique to the display device, the offset time (to be symbolized as "Hp_st" hereinafter) until projection for each line becomes effective is used. The offset time is the time taken for the display device to reach a projection state in which measurement is possible after the start of projection, and depends on the response property of rise of the display device.
As another piece of performance characteristic information unique to the display device, the time variation (to be symbolized as "ΔHp" hereinafter) for each line is used. In the line-sequential driving scheme, a predetermined variation in projection start time occurs in each individual line, and the degree of variation is determined depending on the active matrix driving scheme of the liquid crystal projector, and the circuit configuration of, for example, a line buffer.
As still another piece of performance characteristic information unique to the display device, the effective projection time (to be symbolized as "Hp" hereinafter) for each line is used. The effective projection time is, for example, the period in which the brightness is 80% or more when the projection pattern is a white pattern, or that in which the brightness is 20% or less when the projection pattern is a black pattern, and depends on the response property of the display device. Based on these pieces of performance characteristic information unique to the display device,
the time variation of the Mth line of the projection unit 102 can be calculated as:
ΔHp × M,
the effective projection start time of the Mth line of the projection unit 102 can be calculated as:
Hp_st + ΔHp × M, and
the effective projection end time of the Mth line of the projection unit 102 can be calculated as:
Hp_st + ΔHp × M + Hp
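The per-line projection timing formulas above can be sketched as follows. This is an illustrative helper, not the apparatus's implementation; dHp stands for ΔHp, and the function name is hypothetical.

```python
# Sketch of the per-line projection timing (hypothetical naming).
# Hp_st: offset until projection becomes effective; dHp: per-line time
# variation (ΔHp); Hp: effective projection time; M: projection line index.
def projection_window(Hp_st, dHp, Hp, M):
    """Effective projection start and end time of the Mth projection line."""
    start = Hp_st + dHp * M       # Hp_st + ΔHp × M
    end = start + Hp              # Hp_st + ΔHp × M + Hp
    return start, end
```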
Although the image capture unit 103 can adopt the global shutter scheme or the rolling shutter scheme as its image capture scheme, the following description assumes a rolling shutter CMOS image capture device, timing adjustment of which is difficult.
As performance characteristic information unique to the rolling shutter image capture device, the pixel speed (to be symbolized as “f” hereinafter), for example, is used. The pixel speed f is the speed at which sensor information is output from the image sensor.
As another piece of performance characteristic information unique to the rolling shutter image capture device, the time variation (to be symbolized as "ΔHs" hereinafter) for each line is used. The time variation ΔHs is the time (including the blanking period) taken for sensor information of one horizontal line to be transferred to an external device.
Pieces of performance characteristic information including the number of horizontal pixels n for each line, the horizontal blanking count (to be symbolized as "bk" hereinafter) for each line, and the exposure time (to be symbolized as "Hs" hereinafter) for each line are parameters that can be freely changed and set by the operator within the tolerance of the image capture unit 103. Based on these parameters, the time variation ΔHs for each line can be calculated as ΔHs = (n + bk) × f, and the process time for each line can be calculated as Hs + ΔHs.
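As an illustrative sketch of this per-line timing calculation (dHs stands for ΔHs; the function name is hypothetical):

```python
# Sketch of the image-capture line timing (hypothetical naming).
# n: horizontal pixels per line; bk: horizontal blanking count;
# f: pixel speed; Hs: exposure time per line.
def line_times(n, bk, f, Hs):
    """Return (ΔHs, per-line process time Hs + ΔHs)."""
    dHs = (n + bk) * f            # ΔHs = (n + bk) × f
    return dHs, Hs + dHs
```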
The case wherein the image capture unit 103 performs partial image capture by ROI control will be considered. The offset time (to be symbolized as "ROI_st" hereinafter) of the ROI control start line L is added as a parameter that can arbitrarily be set and changed. To start image capture by ROI control, projection must be started before the start of image capture on a projection line corresponding to a position identical to the start line position of ROI control. This means that the offset time ROI_st of the ROI control start line L depends on factors associated with the projection side. Hence, from the effective projection start time of the Mth line on the projection side (Hp_st + ΔHp × M), the number of vertical lines (M = N/p), and N = L, we have ROI_st = Hp_st + ΔHp × (L/p). Based on this relation,
the time variation of the Nth line of the image capture unit 103 (addition of the offset time ROI_st of the ROI control start line L) can be calculated as:
ROI_st + ΔHs × N = {Hp_st + ΔHp × (L/p)} + ΔHs × N,
the image capture start time of the Nth line of the image capture unit 103 (addition of the offset time ROI_st of the ROI control start line L) can be calculated as:
ROI_st + ΔHs × N = {Hp_st + ΔHp × (L/p)} + ΔHs × N, and
the image capture end time of the Nth line of the image capture unit 103 (addition of the offset time ROI_st of the ROI control start line L) can be calculated as:
ROI_st + ΔHs × N + Hs = {Hp_st + ΔHp × (L/p)} + ΔHs × N + Hs
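Under the same assumptions, the ROI image capture timing above can be sketched as follows (dHp, dHs stand for ΔHp, ΔHs; the function name is hypothetical):

```python
# Sketch of the ROI image capture timing (hypothetical naming).
# The ROI offset depends on the projection side: ROI_st = Hp_st + ΔHp × (L/p).
def roi_capture_window(Hp_st, dHp, p, L, dHs, Hs, N):
    """Image capture start and end time of the Nth line under ROI control."""
    ROI_st = Hp_st + dHp * (L / p)    # offset of the ROI control start line L
    start = ROI_st + dHs * N          # ROI_st + ΔHs × N
    end = start + Hs                  # ROI_st + ΔHs × N + Hs
    return start, end
```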
Using the projection time information of the projection unit 102, and the image capture time information of the image capture unit 103, the conditions in which the projection and image capture timings are appropriately adjusted and controlled are set as follows. To perform exposure of the image capture unit 103 within the effective projection period of the projection unit 102, it is necessary to satisfy the following two conditions.
First, as condition 1, it is necessary to start the exposure operation of the image capture unit 103 after the effective projection start time of the projection unit 102. This means that the exposure start time of the image capture unit 103 must be set to be after the effective projection start time of the projection unit 102. That is, it is necessary to satisfy relations:
(the image capture start time of the Nth line) ≧ (the effective projection start time of the Mth line)
ROI_st + ΔHs × N ≧ Hp_st + ΔHp × {(L + N)/p} for M = (L + N)/p
{Hp_st + ΔHp × (L/p)} + ΔHs × N ≧ Hp_st + ΔHp × ((L + N)/p)
ΔHs × N − ΔHp × (N/p) ≧ Hp_st + ΔHp × (L/p) − Hp_st − ΔHp × (L/p)
N ≧ 0
Next, as condition 2, it is necessary to end the exposure operation of the image capture unit 103 before the effective projection end time of the projection unit 102. This means that the exposure end time of the image capture unit 103 must be set to be the same as or earlier than the effective projection end time of the projection unit 102. That is, it is necessary to satisfy relations:
(the image capture end time of the Nth line) ≦ (the effective projection end time of the Mth line)
ROI_st + ΔHs × N + Hs ≦ Hp_st + ΔHp × M + Hp
{Hp_st + ΔHp × (L/p)} + ΔHs × N + Hs ≦ Hp_st + ΔHp × ((L + N)/p) + Hp for M = (L + N)/p
Hp_st + ΔHp × (L/p) + ΔHs × N + Hs ≦ Hp_st + ΔHp × ((L + N)/p) + Hp
N ≦ (Hp − Hs)/(ΔHs − ΔHp/p)
N ≦ (Hp − Hs)/((n + bk) × f − ΔHp/p) for ΔHs = (n + bk) × f
As can be seen from the above-mentioned relations, to satisfy both conditions 1 and 2, it is necessary to satisfy a relation:
(Hp−Hs)/((n+bk)×f−ΔHp/p)≧N≧0 (1)
Relation (1) indicates that the time obtained by subtracting the exposure time Hs of the image capture unit 103 from the effective projection time Hp of the projection unit 102 for each line is the moratorium period in which the difference in time generated between projection scanning and image capture scanning within one frame period can be absorbed, and the value obtained by dividing this moratorium period by the difference in time for each line generated between projection scanning and image capture scanning becomes the maximum number of vertical lines Nmax of image capture.
Note that Hp, f, and ΔHp/p are constants, while n, bk, Hs, and N are setting parameters, so the number of horizontal pixels n of image capture, the blanking count bk, the exposure time Hs of the image capture unit 103, and the number of vertical lines N of image capture in the ROI control area are determined so as to satisfy the condition presented in relation (1), thereby appropriately adjusting the projection and image capture timings.
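By way of illustration, the parameter check implied by relation (1) can be sketched as follows, assuming ΔHs > ΔHp/p (the function name is hypothetical):

```python
# Sketch of relation (1) (hypothetical naming): the maximum number of
# vertical lines N of image capture in the ROI control area,
# Nmax = (Hp - Hs) / ((n + bk) × f - ΔHp/p).
def max_roi_lines(Hp, Hs, n, bk, f, dHp, p):
    """Return the maximum N permitted by relation (1)."""
    denom = (n + bk) * f - dHp / p
    if denom <= 0:
        raise ValueError("requires ΔHs > ΔHp/p, i.e. (n + bk) × f > ΔHp/p")
    return int((Hp - Hs) / denom)
```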
By setting the ROI control start line L, the image capture start time s1 can be calculated as:
s1 = Hp_st + ΔHp × (L/p) for M = L/p
Then, by setting the number of horizontal pixels n, the blanking count bk, and the number of vertical lines N in the ROI control area so as to satisfy relation (1), the image capture end time s2 can be calculated as:
s2 = {Hp_st + ΔHp × (L/p)} + ΔHs × N + Hs
Although the case wherein the image capture unit 103 performs ROI control and image capture has been described above, two cases wherein the image capture unit 103 does not perform ROI control are assumed as follows:
case 1 wherein image capture of the entire field starts simultaneously with the start of projection without ROI control, and
case 2 wherein image capture of the entire field starts with a delay corresponding to the offset value Hs_st from the start of projection without ROI control.
In each of cases 1 and 2, as in the case wherein ROI control is performed,
the time variation of the Nth line of the image capture unit 103 can be calculated as:
ΔHs×N (case 1)
Hs_st+ΔHs×N (case 2)
the image capture start time of the Nth line of the image capture unit 103 can be calculated as:
ΔHs×N (case 1)
Hs_st+ΔHs×N, and (case 2)
the image capture end time of the Nth line of the image capture unit 103 can be calculated as:
ΔHs×N+Hs (case 1)
Hs_st+ΔHs×N+Hs (case 2)
These parameters are similarly applied to conditions 1 and 2. First, as condition 1, it is necessary to start the exposure operation of the image capture unit 103 after the effective projection start time of the projection unit 102. This means that the exposure start time of the image capture unit 103 must be set to be after the effective projection start time of the projection unit 102. That is, it is necessary to satisfy relations:
(the image capture start time of the Nth line)≧(the effective projection start time of the Mth line)
ΔHs×N≧Hp_st+ΔHp×(N/p)
N≧Hp_st/((n+bk)×f−ΔHp/p) for ΔHs=(n+bk)×f (case 1)
Hs_st+ΔHs×N≧Hp_st+ΔHp×(N/p)
N≧(Hp_st−Hs_st)/((n+bk)×f−ΔHp/p) for ΔHs=(n+bk)×f (case 2)
Next, as condition 2, it is necessary to end the exposure operation of the image capture unit 103 before the effective projection end time of the projection unit 102. This means that the exposure end time of the image capture unit 103 must be set to be the same as or earlier than the effective projection end time of the projection unit 102. That is, it is necessary to satisfy relations:
(the image capture end time of the Nth line)≦(the effective projection end time of the Mth line)
ΔHs×N+Hs≦Hp_st+ΔHp×(N/p)+Hp
N≦(Hp_st+Hp−Hs)/((n+bk)×f−ΔHp/p) for ΔHs=(n+bk)×f (case 1)
Hs_st+ΔHs×N+Hs≦Hp_st+ΔHp×(N/p)+Hp
N≦(Hp_st+Hp−Hs_st−Hs)/((n+bk)×f−ΔHp/p) for ΔHs=(n+bk)×f (case 2)
As can be seen from the above-mentioned relations, to satisfy both conditions 1 and 2, it is necessary to satisfy relations:
(case 1)
(Hp_st+Hp−Hs)/((n+bk)×f−ΔHp/p)≧N≧Hp_st/((n+bk)×f−ΔHp/p) for N≧0 (2)
(case 2)
(Hp_st+Hp−Hs_st−Hs)/((n+bk)×f−ΔHp/p)≧N≧(Hp_st−Hs_st)/((n+bk)×f−ΔHp/p) for N≧0 (3)
This means that in case 1, the time obtained by subtracting the exposure time Hs of the image capture unit 103 from the effective projection time Hp of the projection unit 102 and adding the projection start offset value Hp_st to the difference is the moratorium period in which the difference in time generated between projection scanning and image capture scanning within one frame period can be absorbed. The value obtained by dividing this moratorium period by the difference in time for each line generated between projection scanning and image capture scanning becomes the maximum number of vertical lines of image capture which satisfies relation (2), so an image capture area defined up to this number of vertical lines can be captured. In addition, the projection start offset time Hp_st is the period in which the image capture unit 103 stands by for image capture using the difference in time generated between projection scanning and image capture scanning within one frame period. The value obtained by dividing this period by the difference in time for each line becomes the minimum number of vertical lines of image capture which satisfies relation (2), so an image capture area defined from this number of vertical lines can be captured. That is, from the above-mentioned two conditions, relation (2) indicates that an image capture area defined from the minimum number of vertical lines to the maximum number of vertical lines is effective.
On the other hand, in case 2, the time obtained by adding the difference obtained by subtracting the exposure time Hs of the image capture unit 103 from the effective projection time Hp of the projection unit 102 and the difference obtained by subtracting the image capture start offset time Hs_st from the projection start offset time Hp_st is the moratorium period in which the difference in time generated between projection scanning and image capture scanning within one frame period can be absorbed. The value obtained by dividing this moratorium period by the difference in time for each line generated between projection scanning and image capture scanning becomes the maximum number of vertical lines of image capture which satisfies relation (3), so an image capture area defined up to this number of vertical lines can be captured. In addition, the difference obtained by subtracting the image capture start offset time Hs_st from the projection start offset time Hp_st is the period in which the image capture unit 103 stands by for image capture using the difference in time generated between projection scanning and image capture scanning within one frame period. The value obtained by dividing this period by the difference in time for each line becomes the minimum number of vertical lines of image capture which satisfies relation (3), so an image capture area defined from this number of vertical lines can be captured. That is, from the above-mentioned two conditions, relation (3) indicates that an image capture area defined from the minimum number of vertical lines to the maximum number of vertical lines is effective.
Note that Hp, f, and ΔHp/p are constants, while n, bk, Hs, N, and Hs_st are setting parameters, so the number of horizontal pixels n of image capture, the blanking count bk, the exposure time Hs of image capture, the number of vertical lines N of image capture in the ROI control area, and the offset value Hs_st from the start of projection are determined so as to satisfy the condition presented in relation (2) or (3), thereby appropriately adjusting the projection and image capture timings.
The image capture start time s1 and image capture end time s2 can be calculated as:
s1=Hp_st
s2=Hp_st+(n+bk)×f×N+Hs (case 1)
s1=Hs_st
s2=Hp_st+(n+bk)×f×N+Hs (case 2)
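The valid line ranges given by relations (2) and (3) can likewise be sketched. The helper names and example numbers below are illustrative assumptions; the formulas themselves are transcribed from the relations.

```python
# Valid vertical-line ranges [Nmin, Nmax] for the two non-ROI cases.
# Relation (2): Hp_st/denom <= N <= (Hp_st + Hp - Hs)/denom
# Relation (3): (Hp_st - Hs_st)/denom <= N <= (Hp_st + Hp - Hs_st - Hs)/denom
# where denom = (n + bk)*f - dHp/p; names here are illustrative only.

def line_range_case1(Hp_st, Hp, Hs, n, bk, f, dHp, p):
    """Range of N for case 1 (capture starts with projection)."""
    denom = (n + bk) * f - dHp / p
    return max(0.0, Hp_st / denom), (Hp_st + Hp - Hs) / denom

def line_range_case2(Hp_st, Hs_st, Hp, Hs, n, bk, f, dHp, p):
    """Range of N for case 2 (capture delayed by the offset Hs_st)."""
    denom = (n + bk) * f - dHp / p
    return max(0.0, (Hp_st - Hs_st) / denom), (Hp_st + Hp - Hs_st - Hs) / denom
```

With the same toy numbers as before (denom = 2 time units per line), case 1 with Hp_st = 2, Hp = 100, Hs = 10 yields the range [1, 46], and case 2 with an additional capture offset Hs_st = 1 yields [0.5, 45.5]; in practice N would be rounded to an integer inside the range.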
With the above-mentioned operation, the image capture parameters determined in the above-mentioned way are set for the image capture unit 103, and the image capture start timing of the synchronization control unit 104 is set to the value calculated in the above-mentioned way. Then, the synchronization control unit 104 outputs an external trigger signal to the image capture unit 103 with a delay corresponding to the set value relative to a projection synchronization signal, and the image capture unit 103 captures an image in synchronism with the output trigger signal. This allows projection and image capture for each frame and, in turn, low-cost, high-speed three dimensional measurement even for a combination of a line-sequential driving projection unit 102 and a rolling shutter image capture unit 103.
Case 1 corresponds to a combination of display of a line-sequential driving projection unit 102 and a rolling shutter image capture unit 103 in the left part of
As described above, according to this embodiment, a variable ON/OFF light source is used as the light source of the projection unit to synchronize the projection unit and the image capture unit based on, for example, the time variation between the projection unit and the image capture unit for each line, the response property of the display device, the image capture area information (the ROI size and position), and the projection pattern. Also, the light source is kept ON only during the period required for measurement, thereby constructing a three dimensional measurement apparatus including a projection unit excellent in terms of the use efficiency of the light source. Moreover, the ON time of the light source is shortened to prevent an increase in temperature of the light source, thereby allowing life prolongation and power saving of the light source.
The synchronization timing between projection and image capture when a partial area of the entire area is measured in the second embodiment will be described with reference to
Note that
The right part of
On the other hand,
The right part of
The synchronization timing between projection and image capture when a light source is kept ON only during the period corresponding to the white portion (high luminance portion) of the projection pattern in the third embodiment will be described with reference to
As in the cases of
The right part of
The synchronization timing between projection and image capture when the ON period is set longer than the image capture period will be described in this embodiment. The reason why the ON period is set longer than the image capture period will be explained first with reference to
First, in measuring a target object having a given depth in three dimensional measurement, projection and image capture of the measurement distance are performed at positions shifted to the front or rear from a reference position within the measurement tolerance in the depth direction. In this case, even if the projection area coincides with the image capture area at the reference position, the projection area shifts to the projection side and narrows at a position more to the front than the reference position, so an area 801 to which the projection pattern is not projected is partially generated in the image capture area on the image capture side at that position. In contrast to this, the projection area shifts to the projection side and widens at a position more to the back than the reference position, but an area 802 to which the projection pattern is not projected is partially generated in the image capture area on the image capture side at that position. For this reason, it is necessary to set the ON period longer than the image capture period so as to allow measurement by projection to the area 802.
The synchronization timing between projection and image capture when the ON period is set longer than the image capture period in the fourth embodiment will be described with reference to
A configuration which obtains the characteristic information of a projection unit and image capture unit to generate a synchronization timing between projection and image capture will be described in this embodiment.
The configuration of a three dimensional measurement apparatus in the fifth embodiment will be described with reference to
The projection/image capture performance characteristic information storage unit 101-5 stores the performance characteristic information of the projection unit and image capture unit in advance. The projection/image capture synchronization information generation unit 101-6 is configured to read out required performance characteristic information from the projection/image capture performance characteristic information storage unit 101-5, generate synchronization information from the readout performance characteristic information using the calculation method described in the first embodiment, and output the synchronization information to a synchronization control unit 104 as needed.
Therefore, when the specification of either the projection unit or the image capture unit is changed, the performance characteristic information of the changed part is newly input to and stored in the projection/image capture performance characteristic information storage unit 101-5, thereby automatically generating a synchronization timing based on the stored performance characteristic information. This makes it possible to easily cope with a change in specification of either the projection unit or the image capture unit.
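The flow described above can be pictured with a small sketch. This is a hypothetical illustration, not the disclosed implementation: the dataclass fields and function names are assumptions, standing in for the stored performance characteristic information and for the relation-(1) calculation of the first embodiment.

```python
from dataclasses import dataclass

@dataclass
class PerformanceCharacteristics:
    """Stands in for entries of the projection/image capture performance
    characteristic information storage unit 101-5 (fields are assumed)."""
    Hp: float      # effective projection time
    Hp_st: float   # projection start offset
    dHp: float     # per-line projection time variation
    p: float       # projection-to-capture line ratio
    n: int         # number of horizontal pixels of image capture
    bk: int        # blanking count
    f: float       # per-pixel time
    Hs: float      # exposure time of image capture

def generate_sync_info(c: PerformanceCharacteristics):
    """Derive (trigger_delay, max_lines) for the synchronization control
    unit 104, following the relation-(1) style calculation."""
    denom = (c.n + c.bk) * c.f - c.dHp / c.p
    max_lines = int((c.Hp - c.Hs) / denom)
    trigger_delay = c.Hp_st  # delay of the capture trigger vs the projection sync signal
    return trigger_delay, max_lines
```

Under this sketch, swapping the projection unit or image capture unit only requires storing a new `PerformanceCharacteristics` record; the synchronization timing is then regenerated automatically, mirroring the behavior described above.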
The internal configuration of a light source unit 102-1 will be described with reference to
The light source unit 102-1 shown in
In the case of
According to the present invention, the ON time of the light source is shortened to prevent an increase in temperature of the light source, thereby allowing life prolongation and power saving of the light source.
Aspects of the present invention can also be realized by a computer of a system or apparatus (or devices such as a CPU or MPU) that reads out and executes a program recorded on a memory device to perform the functions of the above-described embodiment(s), and by a method, the steps of which are performed by a computer of a system or apparatus by, for example, reading out and executing a program recorded on a memory device to perform the functions of the above-described embodiment(s). For this purpose, the program is provided to the computer for example via a network or from a recording medium of various types serving as the memory device (for example, computer-readable storage medium).
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Application No. 2011-277658 filed on Dec. 19, 2011, which is hereby incorporated by reference herein in its entirety.
Number | Date | Country | Kind
---|---|---|---
2011-277658 | Dec 2011 | JP | national

Filing Document | Filing Date | Country | Kind | 371c Date
---|---|---|---|---
PCT/JP2012/079444 | 11/7/2012 | WO | 00 | 5/29/2014