The present application claims priority to Japanese Patent Application No. 2021-103903, filed on Jun. 23, 2021. The contents of this application are incorporated herein by reference in their entirety.
The present disclosure relates to an optical sensor and a geometry measurement apparatus.
In a geometry measurement apparatus, a non-contact type of optical sensor is used to measure a cross-sectional shape of an object to be measured using a light section method based on a triangulation principle. The optical sensor irradiates the object to be measured with line shaped light, and captures an image of the object to be measured on the basis of light reflected from a surface of the object to be measured (see Japanese Patent No. 5869281).
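As an illustration of the triangulation principle underlying the light section method, the following sketch converts a displacement of the reflected line on the image sensor into a surface height. The geometry, function name, and parameter values are illustrative assumptions for a simple camera-at-an-angle model, not details taken from the sensor described here.

```python
import math

def height_from_pixel_shift(pixel_shift, pixel_pitch_mm, magnification, view_angle_deg):
    """Illustrative light-section triangulation.

    A surface height change moves the illuminated spot along the
    light-section plane; a camera viewing that plane at `view_angle_deg`
    sees the motion foreshortened by sin(angle) and scaled by the optical
    magnification. All names and values here are assumptions.
    """
    shift_mm = pixel_shift * pixel_pitch_mm        # displacement on the sensor
    return shift_mm / (magnification * math.sin(math.radians(view_angle_deg)))

# Example: a 12-pixel shift, 5 um pixel pitch, 0.5x magnification, 30 deg view angle
h = height_from_pixel_shift(12, 0.005, 0.5, 30.0)
print(h)   # -> 0.24 (mm)
```

A real sensor would calibrate this mapping rather than rely on a closed-form model, but the sketch shows why a lateral shift of the imaged line encodes the cross-sectional shape.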
In such optical sensors, the line shaped light is radiated to the object to be measured as a straight line, but due to an error caused by a lens component included in the optical sensor or the like, the distribution of the line shaped light on the surface of the object to be measured may be undulating instead of straight. In this case, an imaging part captures an undulating image, resulting in an error in the measurement of the geometry of the object to be measured.
The present disclosure focuses on this point, and an object of the present disclosure is to suppress a measurement error when an object to be measured is measured by radiating line shaped light thereto.
A first aspect of the present disclosure provides an optical sensor including a radiation part that irradiates an object to be measured with line shaped light, and an imaging part that receives line shaped light reflected by the object to be measured and captures an image of the object to be measured in a predetermined exposure time, wherein the radiation part includes a light generation part that generates the line shaped light, and a light vibration part that irradiates the object to be measured with the line shaped light generated by the light generation part while vibrating the line shaped light in a length direction during the exposure time.
A configuration of an optical sensor according to the first embodiment will be described with reference to
The optical sensor 10 is used to measure a cross-sectional shape of an object to be measured W at the light-section plane (in
The radiation part 20 irradiates the object to be measured W with the line shaped light L. Specifically, the radiation part 20 deforms laser light into the line shaped light L and irradiates the object to be measured W with the line shaped light L. As shown in
The light source 22 is formed by a Laser Diode (LD) or the like, for example, and generates and emits the laser light. The light source 22 emits the laser light with a predetermined wavelength.
The collimator lens 24 collimates the laser light emitted from the light source 22. The collimator lens 24 is a convex lens in this embodiment.
The cylindrical lens 26 deforms parallel light (laser light) from the collimator lens 24 into the line shaped light L having a line shape. In the present embodiment, the cylindrical lens 26 corresponds to a light generation part that generates the line shaped light L.
An image forming lens 30 forms an image of the line shaped light L, which is reflected light reflected by the object to be measured W, on an imaging surface of the imaging part 40. The image forming lens 30 here is a convex lens.
The imaging part 40 is an image sensor such as a CMOS, for example, and captures the image of the object to be measured W. The imaging part 40 receives the line shaped light L reflected by the object to be measured W, and captures the image of the object to be measured W in a predetermined exposure time. That is, the imaging part 40 captures an image of light distribution indicating the cross-sectional shape of the object to be measured W at the light-section plane. As shown in
Incidentally, although the line shaped light L is radiated to the object to be measured W as a straight line, due to an error or the like caused by a lens component included in the optical sensor 10, the distribution of the line shaped light L on the surface of the object to be measured W may be undulating instead of straight. Specifically, the distribution of the line shaped light L undulates in the normal direction of the light-section plane. In this case, the imaging part 40 captures an undulating image, resulting in an error in the measurement of the geometry of the object to be measured W.
In contrast, in the optical sensor 10 of the present embodiment, the radiation part 20 is provided with the light vibration part 50 in order to suppress the measurement error. The light vibration part 50 vibrates the line shaped light L radiated to the object to be measured W, and averages out the undulation of the line shaped light L in the normal direction of the light-section plane. Specifically, the light vibration part 50 irradiates the object to be measured W with the line shaped light L while vibrating the line shaped light L in the length direction during the exposure time of the imaging part 40. Thus, the imaging part 40 captures the image of the line shaped light L vibrating during the exposure time, and the image formed on the imaging surface of the imaging part 40 has averaged-out undulation in the normal direction.
The light vibration part 50 irradiates the object to be measured W with the line shaped light L while causing the line shaped light L to make one reciprocation in the length direction during the exposure time of the imaging part 40. It should be noted that the present disclosure is not limited to the above, and the light vibration part 50 may irradiate the object to be measured W with the line shaped light L while causing the line shaped light L to reciprocate a plurality of times in the length direction during the exposure time of the imaging part 40. That is, the light vibration part 50 reciprocates the line shaped light at least once in the length direction during the exposure time. This makes it easier to average out random undulations in the normal direction of the line shaped light L.
When the undulation of the line shaped light L has a predetermined cycle in the length direction, the light vibration part 50 irradiates the object to be measured W with the line shaped light L while vibrating it such that the line shaped light L is shifted by ½ or more of the cycle. For example, the light vibration part 50 vibrates the line shaped light L such that the line shaped light L is shifted by ½ of a cycle T shown in
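The averaging effect described above can be sketched numerically. The model below is a deliberately idealized assumption: the undulation is a pure sinusoid along the line length, and the vibration is a uniform sweep during the exposure. It shows that a ½-cycle sweep already reduces the residual undulation amplitude, and a full-cycle sweep cancels it almost completely; the quantitative factors apply only to this toy model, not to the sensor itself.

```python
import math

T = 1.0          # undulation cycle along the line length (illustrative units)
AMP = 1.0        # undulation amplitude in the normal direction

def averaged_amplitude(sweep, samples=2000):
    """Residual undulation amplitude after time-averaging a sinusoidal
    undulation sin(2*pi*x/T) while the line sweeps uniformly over `sweep`
    during the exposure (a simple model of the vibration)."""
    xs = [i / 200.0 for i in range(200)]            # positions along the line
    residual = []
    for x in xs:
        # exposure integrates the shifted undulation over the sweep
        total = sum(AMP * math.sin(2 * math.pi * (x + s) / T)
                    for s in (sweep * k / samples for k in range(samples)))
        residual.append(total / samples)
    return max(abs(r) for r in residual)

print(averaged_amplitude(0.0))   # no vibration: full amplitude (~1.0)
print(averaged_amplitude(T / 2)) # half-cycle sweep: amplitude reduced (~0.64)
print(averaged_amplitude(T))     # full-cycle sweep: nearly cancelled (~0.0)
```

The larger the shift relative to the undulation cycle, the more the exposure-time integration smooths the undulation in the normal direction.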
As shown in
The plane mirror 52 reflects the line shaped light L from the cylindrical lens 26 toward the rocking mirror 54. Here, the plane mirror 52 reflects the line shaped light L by 90°. The plane mirror 52 is a fixed mirror.
The rocking mirror 54 is a mirror that directs the line shaped light L reflected from the plane mirror 52 toward the object to be measured W. Here, the rocking mirror 54 reflects the line shaped light L vertically downward. The rocking mirror 54 rocks to vibrate the line shaped light L directed toward the object to be measured W. The rocking mirror 54 rocks about an axis C (see
The sensor controller 70 controls an operation of the optical sensor 10. The sensor controller 70 controls the radiation of the laser light by the radiation part 20 and the capturing of the image of the object to be measured W by the imaging part 40.
The sensor controller 70 controls the vibration of the line shaped light L by the light vibration part 50. For example, the sensor controller 70 rocks the rocking mirror 54 of the light vibration part 50 at high speed to vibrate the distribution of the line shaped light L at high speed in the length direction. Further, the sensor controller 70 controls the exposure of the imaging part 40 and the vibration of the line shaped light L in the length direction by the light vibration part 50 such that they are synchronized with each other. For example, the sensor controller 70 controls the operations of the imaging part 40 and the light vibration part 50 such that the conditions of the exposure time of the imaging part 40 and the rocking angle of the rocking mirror 54 of the light vibration part 50 are constant. Thus, the imaging part 40 can capture the image of the object to be measured W when the line shaped light L vibrates.
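The synchronization condition described above can be expressed as a simple check: the exposure should span a whole number (at least one) of mirror reciprocations, so that every exposure averages the undulation in the same way. The function and the numeric values below are illustrative assumptions, not parameters of the sensor controller 70.

```python
def is_synchronized(exposure_s, rocking_freq_hz, tol=1e-6):
    """True if the exposure spans a whole number (>= 1) of mirror
    reciprocations, so each captured frame sees the same averaged
    light distribution. Parameter names and values are illustrative."""
    cycles = exposure_s * rocking_freq_hz
    return cycles >= 1.0 - tol and abs(cycles - round(cycles)) <= tol

print(is_synchronized(0.002, 500.0))   # 2 ms exposure, 500 Hz rocking -> True
print(is_synchronized(0.002, 750.0))   # 1.5 reciprocations -> False
```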
A configuration of a geometry measurement apparatus 1 including the optical sensor 10 having the above-described configuration will be described with reference to
Since the configuration of the optical sensor 10 is as described above, a detailed description thereof will be omitted here. The moving mechanism 80 moves the optical sensor 10. For example, the moving mechanism 80 moves the optical sensor 10 in three axial directions orthogonal to each other.
The control apparatus 90 controls the operation of the optical sensor 10 (specifically, the radiation part 20, the imaging part 40, and the light vibration part 50) and the moving mechanism 80. Further, the control apparatus 90 performs the measurement using the optical sensor 10 by moving the optical sensor 10 with the moving mechanism 80, for example. The control apparatus 90 includes a storage 92 and a control part 94.
The storage 92 includes a Read Only Memory (ROM) and a Random Access Memory (RAM), for example. The storage 92 stores various types of data and a program executed by the control part 94. For example, the storage 92 stores a result of the measurement by the optical sensor 10.
The control part 94 is a Central Processing Unit (CPU), for example. The control part 94 executes the program stored in the storage 92 to control the operation of the optical sensor 10 via the sensor controller 70. Specifically, the control part 94 controls the radiation of the laser light to the object to be measured W by the light source 22 of the radiation part 20. Further, the control part 94 acquires an output of the imaging part 40 and calculates the geometry of the object to be measured W. In the present embodiment, the control part 94 functions as a calculation part that calculates the geometry of the object to be measured W on the basis of the output of the imaging part 40.
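One common way such a calculation part extracts the profile from the output of the imaging part is to locate the laser line in each image column at sub-pixel precision, for example by an intensity centroid, and then convert each position to a height by triangulation. The text does not specify the method used by the control part 94, so the sketch below is an assumption for illustration only.

```python
def line_peak_positions(image):
    """Sub-pixel laser-line position per column by intensity centroid.

    `image` is a list of rows of grayscale values; for each column, the
    centroid of the intensity profile estimates where the line crosses
    that column. A triangulation model would then map each position to a
    height. This is one common technique, not necessarily the one used
    by the apparatus described here.
    """
    rows = len(image)
    cols = len(image[0])
    positions = []
    for c in range(cols):
        col = [image[r][c] for r in range(rows)]
        total = sum(col)
        if total == 0:
            positions.append(None)          # no line detected in this column
        else:
            positions.append(sum(r * v for r, v in enumerate(col)) / total)
    return positions

# A toy 5x3 image with the line centered on row 2 of every column
img = [[0, 0, 0],
       [1, 0, 2],
       [4, 5, 4],
       [1, 0, 2],
       [0, 0, 0]]
print(line_peak_positions(img))   # -> [2.0, 2.0, 2.0]
```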
In the optical sensor 10 of the first embodiment, the radiation part 20 includes the light vibration part 50 that irradiates the object to be measured W with the line shaped light L while vibrating the line shaped light L in the length direction during the exposure time of the imaging part 40.
Thus, even if the line shaped light L undulates in the normal direction on the surface of the object to be measured W due to an error or the like caused by the lens component of the optical sensor 10, the image formed on the imaging surface of the imaging part 40 will have the averaged-out undulation since the imaging part 40 captures the vibrating line shaped light L during the exposure time. As a result, it is possible to suppress the measurement error of the object to be measured W caused by the undulation of the line shaped light L in the normal direction.
In the second embodiment, the configuration of the light vibration part 50 is different from that in the first embodiment, and the other configurations are the same as those in the first embodiment.
The actuator 60 reciprocates the light source 22 of the radiation part 20 in the length direction of the line shaped light L. The light source 22 is reciprocated between a first position shown in
Also in the second embodiment, during the exposure time of the imaging part 40, the light vibration part 50 reciprocates the light source 22 in the length direction of the line shaped light L using the actuator 60 to vibrate the line shaped light L in the length direction. Therefore, the imaging part 40 captures the image of the object to be measured W when the line shaped light L vibrates. Thus, even if the line shaped light L undulates in the normal direction on the surface of the object to be measured W, the image formed on the imaging surface of the imaging part 40 will have the averaged-out undulation. As a result, it is possible to suppress the measurement error of the object to be measured W caused by the undulation in the normal direction of the line shaped light L.
In the above description, the actuator 60 reciprocates the light source 22 to vibrate the line shaped light L, but the present disclosure is not limited thereto. The actuator 60 may reciprocate the collimator lens 24 and the cylindrical lens 26 in the length direction of the line shaped light L instead of the light source 22. For example, like the light source 22 shown in
Also in this variation, the light vibration part 50 reciprocates the collimator lens 24 and the cylindrical lens 26 using the actuator 60 during the exposure time of the imaging part 40, thereby vibrating the line shaped light L in the length direction. Thus, the imaging part 40 captures the image of the object to be measured W when the line shaped light L vibrates.
In the third embodiment, the configuration of the light vibration part 50 is different from that in the first embodiment, and the other configurations are the same as those in the first embodiment.
The rotating mirror 65 rotates in a direction of an arrow shown in
Also in the third embodiment, the light vibration part 50 rotates the rotating mirror 65 during the exposure time of the imaging part 40 to vibrate the line shaped light L in the length direction. Therefore, the imaging part 40 captures the image of the object to be measured W when the line shaped light L vibrates. Thus, even if the line shaped light L undulates in the normal direction on the surface of the object to be measured W, the image formed on the imaging surface of the imaging part 40 will have the averaged-out undulation. As a result, it is possible to suppress the measurement error of the object to be measured W caused by the undulation of the line shaped light L in the normal direction.
The present invention has been explained on the basis of the exemplary embodiments. The technical scope of the present invention is not limited to the scope explained in the above embodiments, and it is possible to make various changes and modifications within the scope of the invention. For example, all or part of the apparatus can be configured with any unit that is functionally or physically dispersed or integrated. Further, new exemplary embodiments generated by arbitrary combinations of the above exemplary embodiments are also included in the exemplary embodiments of the present invention. Further, the new exemplary embodiments brought about by the combinations also have the effects of the original exemplary embodiments.
Number | Date | Country | Kind
---|---|---|---
2021-103903 | Jun. 23, 2021 | JP | national